Winter Quarter  

Perspectives on Computational Modeling

(MACS 30100). Rick Evans, M/W 11:30 a.m.-1:20 p.m. & Weekly Lab Wednesdays 4:30-5:20 p.m.

Students are often well trained in the details of specific models relevant to their respective fields. This course presents a generic definition of a model in the social sciences as well as a taxonomy of the wide range of different types of models used. We then cover principles of model building, including static versus dynamic models, linear versus nonlinear, simple versus complicated, and identification versus overfitting. Major types of models implemented in this course include systems of nonlinear equations, linear and nonlinear regression, supervised learning (decision trees, random forests, support vector machines, etc.), and unsupervised learning. We will also explore the wide range of computational strategies used to estimate models from data and make statistical and causal inference. Students will study both good examples and bad examples of modeling and estimation.
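As a flavor of the simplest model family on that list, linear and nonlinear regression, here is a minimal sketch of fitting a line by ordinary least squares in closed form. The data are invented for illustration and are not course material.

```python
# Ordinary least squares for a single predictor, in closed form.
def ols_fit(xs, ys):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Illustrative data, roughly y = 2x
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
intercept, slope = ols_fit(xs, ys)
```

The same closed form generalizes to multiple predictors via the normal equations, which is one place the course's linear algebra prerequisites come into play.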

MA Research Commitment

(MACS 35000). James Evans, By Arrangement.

Student-initiated research and writing for the MA research component. Open only to MACSS students.

Structural Estimation

(MACS 40200). Rick Evans, M/W 1:30-2:50 p.m.

Structural estimation refers to the estimation of model parameters by taking a theoretical model directly to the data. (This is in contrast to reduced form estimation, which often entails estimating a linear model that is, explicitly or implicitly, a simplified version of a related theoretical model.) This class will survey a range of structural models, then teach students estimation approaches including the generalized method of moments and maximum likelihood estimation. We will examine the strengths and weaknesses of both approaches in a series of examples from economics, political science, and sociology, and will also learn the simulated method of moments approach. We will explore applications across the social sciences.
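To make one of the named estimation approaches concrete, here is a toy maximum likelihood example: recovering the rate of an exponential distribution by a crude grid search and checking it against the closed-form estimator. The data and grid are illustrative assumptions, not course material.

```python
# Toy maximum likelihood estimation of an exponential rate parameter.
import math

data = [0.5, 1.2, 0.3, 2.0, 0.8, 1.1]  # illustrative observations

def log_likelihood(lam, xs):
    """Exponential log-likelihood: n*log(lam) - lam * sum(xs)."""
    return len(xs) * math.log(lam) - lam * sum(xs)

# Crude grid search over candidate rates; the closed-form MLE is
# lambda_hat = n / sum(x), which the search should approximately recover.
grid = [i / 1000 for i in range(1, 5000)]
lam_hat = max(grid, key=lambda lam: log_likelihood(lam, data))
closed_form = len(data) / sum(data)
```

The grid search stands in for the numerical optimizers used in practice; here it recovers the closed-form estimator to grid precision.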

Introduction to Causal Inference

(MACS 51000). Guanglei Hong, Kazuo Yamaguchi, and Fang Yang, Tuesdays 2-4:50 p.m. & Weekly Lab Fridays 1:30-2:50 p.m.

This course is designed for graduate students and advanced undergraduate students from the social sciences, education, public health science, public policy, social service administration, and statistics who are involved in quantitative research and are interested in studying causality. The goal of this course is to equip students with basic knowledge of and analytic skills in causal inference. Topics for the course will include the potential outcomes framework for causal inference; experimental and observational studies; identification assumptions for causal parameters; potential pitfalls of using ANCOVA to estimate a causal effect; propensity score-based methods including matching, stratification, inverse-probability-of-treatment weighting (IPTW), marginal mean weighting through stratification (MMWS), and doubly robust estimation; the instrumental variable (IV) method; regression discontinuity design (RDD), including sharp and fuzzy RDD; difference-in-differences (DID) and generalized DID methods for cross-sectional and panel data; and fixed effects models. Intermediate Statistics or equivalent such as STAT 224 is a prerequisite. This course is a prerequisite for “Advanced Topics in Causal Inference” and “Mediation, moderation, and spillover effects.”
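As a small taste of one listed method, inverse-probability-of-treatment weighting, the sketch below reweights outcomes by 1/p for treated units and 1/(1-p) for controls. The propensity scores and outcomes are hypothetical numbers chosen for illustration.

```python
# Hand-computed sketch of inverse-probability-of-treatment weighting (IPTW).

# (propensity score, treated?, outcome) for six hypothetical units
units = [
    (0.8, 1, 10.0),
    (0.8, 0, 7.0),
    (0.2, 1, 6.0),
    (0.2, 0, 4.0),
    (0.5, 1, 8.0),
    (0.5, 0, 5.0),
]

def iptw_means(units):
    """Weighted outcome means: treated get weight 1/p, controls 1/(1-p)."""
    tw = ty = cw = cy = 0.0
    for p, treated, y in units:
        if treated:
            w = 1.0 / p
            tw += w
            ty += w * y
        else:
            w = 1.0 / (1.0 - p)
            cw += w
            cy += w * y
    return ty / tw, cy / cw

treated_mean, control_mean = iptw_means(units)
ate = treated_mean - control_mean  # weighted treated-control contrast
```

The weighting makes each group resemble the full sample on the propensity score, which is the intuition behind the IPTW estimators covered in the course.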

Computational Content Analysis

(MACS 60000). James Evans, Fridays 1:30-4:20 p.m.

A vast expanse of information about what people do, know, think, and feel lies embedded in text, and more of the contemporary social world lives natively within electronic text than ever before. These textual traces range from collective activity on the web, social media, instant messaging and automatically transcribed YouTube videos to online transactions, medical records, digitized libraries and government intelligence. This supply of text has elicited demand for natural language processing and machine learning tools to filter, search, and translate text into valuable data. The course will survey and practically apply many of the most exciting computational approaches to text analysis, highlighting both supervised methods that extend old theories to new data and unsupervised techniques that discover hidden regularities worth theorizing. These will be examined and evaluated on their own merits, and relative to the validity and reliability concerns of classical content analysis, the interpretive concerns of qualitative content analysis, and the interactional concerns of conversation analysis. We will also consider how these approaches can be adapted to content beyond text, including audio, images, and video. We will simultaneously review recent research that uses these approaches to develop social insight by exploring (a) collective attention and reasoning through the content of communication; (b) social relationships through the process of communication; and (c) social states, roles, and moves identified through heterogeneous signals within communication. 

The course is structured around gaining understanding and experimenting with text analytical tools, deploying those tools and interpreting their output in the context of individual research projects, and assessment of contemporary research within this domain. Class discussion and assignments will focus on how to use, interpret, and combine computational techniques in the context of compelling social science research investigations.  
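A first step in many of the computational text pipelines surveyed here is converting raw documents into word counts (a bag of words). The sketch below uses only the Python standard library; the documents are toy examples, not course data.

```python
# Turning raw documents into word counts with the standard library.
from collections import Counter
import re

docs = [
    "Social media text is abundant.",
    "Text analysis turns text into data.",
]

def bag_of_words(doc):
    """Lowercase, strip punctuation, and count word occurrences."""
    tokens = re.findall(r"[a-z]+", doc.lower())
    return Counter(tokens)

counts = [bag_of_words(d) for d in docs]
# counts[1]["text"] is 2: "text" appears twice in the second document
```

Counts like these feed directly into the supervised classifiers and unsupervised topic models the course surveys.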

Computational Social Science Workshop

(MACS 50000). James Evans, Thursdays 11 a.m.-12:20 p.m. Saieh 247. PQ: Computation students must register for an R. Other faculty and graduate students welcome.

High performance and cloud computing, massive digital traces of human behavior from ubiquitous sensors, and a growing suite of efficient model estimation, machine learning and simulation tools are not just extending classical social science inquiry, but transforming it to pose novel questions at larger and smaller scales. The Computational Social Science (CSS) Workshop is a weekly event that features this work, highlights associated skills and data, and explores the use of CSS in the world. The CSS Workshop alternates weekly between research workshops and professional workshops. The research workshops feature new CSS work from top faculty and advanced graduate students from UChicago and around the world, while professional workshops highlight useful skills and data (e.g., machine learning with Python’s scikit-learn; the Twitter firehose API) and showcase practitioners using CSS in the government, industry and nonprofit sectors. Each quarter, the CSS Workshop also hosts a distinguished lecture, debate and dinner, and a student conference.

Computer Science with Applications – 2

(CAPP 30122). Anne Rogers, M/W/F 9:30-10:20 a.m. & Weekly Lab Tuesdays at (1) 3-4:20 p.m., (2) 4:30-5:50 p.m., or (3) 6-7:20 p.m.

This course is the second in a three-quarter sequence that teaches computational thinking and skills to students in the sciences, mathematics, economics, etc. Lectures cover topics in (1) data representation, (2) basics of relational databases, (3) shell scripting, (4) data analysis algorithms, such as clustering and decision trees, and (5) data structures, such as hash tables and heaps. Applications and datasets from a wide variety of fields serve both as examples in lectures and as the basis for programming assignments. In recent offerings, students have written a course search engine and a system to do speaker identification. Students will program in Python and do a quarter-long programming project.
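As a quick illustration of one listed data structure, a heap, the following sketch uses Python's heapq module, which keeps the smallest element at the front of a list; the values are arbitrary.

```python
# A min-heap via Python's heapq module.
import heapq

scores = [42, 7, 19, 3, 28]
heapq.heapify(scores)             # O(n) transform into a min-heap
smallest = heapq.heappop(scores)  # pop the minimum in O(log n)
heapq.heappush(scores, 1)         # push preserves the heap property
# scores[0] is now the new minimum, 1
```

Heaps of this kind back priority queues, which appear in several of the data analysis algorithms the course covers.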

Mathematics for Computer Science and Data Analysis

(CAPP 30255). Amitabh Chaudhary, M/W/F 2:30-3:20 p.m.

This course develops the mathematical foundations in discrete mathematics and linear algebra that are broadly used in computer science, particularly in algorithms, databases, machine learning, and data analysis.

The topics from discrete mathematics are essential for developing computational thinking, particularly for the design and analysis of algorithms. These include logic, proofs, big-O notation, recursion, induction, and counting.

In linear algebra, this course covers vectors and vector spaces, how matrices represent linear transformations of vectors, and their relationship to solving systems of linear equations. We study determinants, matrix inverses, projections, and finally, how eigenvectors and eigenvalues allow us to decompose a matrix into simpler matrices. These concepts underlie several techniques in machine learning and data analysis, such as those for dimensionality reduction.
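To ground the eigenvalue material, a small worked example: for a 2x2 matrix the characteristic polynomial is quadratic, so the eigenvalues follow from the quadratic formula. The matrix below is a made-up example with real eigenvalues.

```python
# Eigenvalues of a 2x2 matrix via its characteristic polynomial.
import math

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]], assuming they are real.

    The characteristic polynomial is x^2 - trace*x + det = 0.
    """
    trace = a + d
    det = a * d - b * c
    disc = math.sqrt(trace * trace - 4 * det)
    return (trace + disc) / 2, (trace - disc) / 2

# [[2, 1], [1, 2]] has trace 4 and determinant 3, so eigenvalues 3 and 1
lam1, lam2 = eigenvalues_2x2(2, 1, 1, 2)
```

For larger matrices one uses iterative numerical methods rather than the characteristic polynomial, but the 2x2 case shows what the decomposition captures.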

Statistical Theory and Methods - 1

(STAT 24400). Wei Biao Wu, T/Th 9:30-10:50 a.m. PQ: Multivariate calculus. Some previous experience with statistics and/or probability helpful but not required.

This course is the first quarter of a two-quarter systematic introduction to the principles and techniques of statistics, as well as to practical considerations in the analysis of data, with emphasis on the analysis of experimental data. This course covers tools from probability and the elements of statistical theory. Topics include the definitions of probability and random variables, binomial and other discrete probability distributions, normal and other continuous probability distributions, joint probability distributions and the transformation of random variables, principles of inference (including Bayesian inference), maximum likelihood estimation, hypothesis testing and confidence intervals, likelihood ratio tests, multinomial distributions, and chi-square tests. Examples are drawn from the social, physical, and biological sciences. The coverage of topics in probability is limited and brief, so students who have taken a course in probability find reinforcement rather than redundancy. Students who have already taken STAT 25100 may choose to take STAT 24410 (if offered) instead of STAT 24400.

Statistical Theory and Methods - 2

(STAT 24500). Chao Gao, T/Th 9:30-10:50 a.m. PQ: Multivariate calculus and linear algebra and STAT 24400 or STAT 24410.

This course is the second quarter of a two-quarter systematic introduction to the principles and techniques of statistics, as well as to practical considerations in the analysis of data, with emphasis on the analysis of experimental data. This course continues from either STAT 24400 or STAT 24410 and covers statistical methodology, including the analysis of variance, regression, correlation, and some multivariate analysis. Some principles of data analysis are introduced, and an attempt is made to present the analysis of variance and regression in a unified framework. Statistical software is used.

Analysis in R^n I

(MATH 20300). Instructor TBD, M/W/F 10:30-11:20 a.m. PQ: MATH 16300 or MATH 15910 or MATH 15900 or MATH 19900.

For students concentrating in Computational Economics who need exposure to real analysis. Students must be proficient in linear algebra. This course covers the construction of the real numbers, the topology of R^n including the Bolzano-Weierstrass and Heine-Borel theorems, and a detailed treatment of abstract metric spaces, including convergence and completeness, compact sets, continuous mappings, and more.

Analysis in R^n II

(MATH 20400). Instructor TBD, M/W/F, Section 31: 10:30-11:20 a.m., Section 41: 11:30 a.m.-12:20 p.m., or Section 51: 12:30-1:20 p.m. PQ: MATH 20700, or MATH 20300 and MATH 20250 or STAT 24300.

For students concentrating in Computational Economics who have taken MATH 20300. This course covers differentiation in R^n including partial derivatives, gradients, the total derivative, the Chain Rule, optimization problems, vector-valued functions, and the Inverse and Implicit Function Theorems.

Methods in Computational Neuroscience

(CPNS 34231). Silvan Bensmaia, M/W 3:30-4:50 p.m., Fridays 1:30-2:50 p.m., & Weekly Lab Wednesdays 9:30-11:20 a.m. PQ: PSYC 36210 and PSYC 36211, which must be taken concurrently, or consent of instructor.

Topics include (but are not limited to): Hodgkin-Huxley equations, cable theory, single-neuron models, information theory, signal detection theory, reverse correlation, relating neural responses to behavior, and rate versus temporal codes.

Foundations of Computational Data Analysis

(MPCS 53110). Geraldine Brady, Days/Times TBD. PQ: Core programming; B or better in MPCS 50103 or a passing score on the math placement exam. Non-MPCS students must meet prerequisites and complete a Course Request Form.

This course covers basic statistics, linear algebra, and programming in R. Topics in statistics include discrete and continuous random variables; discrete and continuous probability distributions; variance, covariance, and correlation; sampling distributions of the sample mean and standard deviation; the central limit theorem; confidence intervals; maximum likelihood estimators; hypothesis testing; and linear and multiple regression. Topics in linear algebra include Gaussian elimination, matrix transpose and inverse, eigenvectors and eigenvalues, and singular value decomposition.

Theoretical Neuroscience: Network Dynamics and Computation

(CPNS 35520). Nicolas Brunel, Days/Times TBD.

This course is the second part of a three-quarter sequence in theoretical/computational neuroscience. It will focus on mathematical models of networks of neurons. Topics will include: firing rate models for populations of neurons; spatially extended firing rate models; models of visual cortex; models of brain networks at different levels; characterization of properties of specific brain networks; models of networks of binary neurons, mean rates, correlations, reductions to rate models; learning in networks of binary neurons, associative memory models; models of networks of spiking neurons: asynchronous vs synchronous states; oscillations in networks of spiking neurons; learning in networks of spiking neurons; models of working memory; models of decision-making.
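As a minimal sketch of the first topic listed, a firing rate model, the code below integrates a single population whose rate relaxes toward a sigmoidal function of its input, using simple Euler steps. All parameters and the choice of sigmoid are illustrative assumptions, not the course's models.

```python
# A single-population firing rate model: tau * dr/dt = -r + f(I),
# integrated with Euler steps, where f is a logistic sigmoid.
import math

def simulate(I, tau=10.0, dt=0.1, steps=1000):
    """Integrate the rate equation and return (final rate, fixed point)."""
    f = 1.0 / (1.0 + math.exp(-I))  # steady-state rate for constant input I
    r = 0.0                         # start from silence
    for _ in range(steps):
        r += dt / tau * (-r + f)
    return r, f

r, target = simulate(I=2.0)
# After many time constants (here 10), the rate has settled near f(I)
```

Population-level models in the course couple many such equations through connection weights; the single-unit case shows the basic relaxation dynamics.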

Machine Learning

(PLSC 43502). Justin Grimmer, M/W 11:30-1:20 p.m.

This course introduces techniques to collect, analyze, and utilize large collections of data for social science inferences. The ultimate goal of the course is to introduce students to modern machine learning techniques and provide the skills necessary to apply the methods widely. In achieving this ultimate goal, students will also: 1) Learn about core concepts in machine learning and statistics, developing skills that are transferable to other types of data and inference problems. 2) Develop their programming abilities in R and be introduced to Python. 3) Be introduced to substantive problems.

Advanced Topics in Biological Psychology

(PSYC 40300). Leslie Kay, Thursdays 2-4:50 p.m.

What are the relations between mind and brain? How do brains regulate mental, behavioral, and hormonal processes; and how do these influence brain organization and activity? This course provides an introduction to the anatomy, physiology, and chemistry of the brain; their changes in response to the experiential and sociocultural environment; and their relation to perception, attention, behavior, action, motivation, and emotion. PQ: Graduate standing and some sophistication with biological topics, including neuroscience.

Mathematical Methods for Biological Sciences - 2

(PSYC 36211). Dmitry Kondrashov, T/Th 2-3:20 p.m. & Weekly Lab Fridays 3:30-5:20 p.m. PQ: PSYC 36210.

This course is a continuation of PSYC 36210. The topics start with optimization problems, such as nonlinear least squares fitting, principal component analysis, and sequence alignment. Stochastic models are introduced, such as Markov chains, birth-death processes, and diffusion processes, with applications including hidden Markov models, tumor population modeling, and networks of chemical reactions. In computer labs, students learn optimization methods and stochastic algorithms, e.g., Markov chain Monte Carlo and the Gillespie algorithm. Students complete an independent project on a topic of their interest.
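To illustrate the Markov chain material in miniature, the sketch below finds the stationary distribution of a two-state chain by repeatedly applying its transition matrix; the chain itself is invented for the example.

```python
# Stationary distribution of a two-state Markov chain by iteration.

# P[i][j] = probability of moving from state i to state j
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(dist, P):
    """One application of the transition matrix to a distribution."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]          # start entirely in state 0
for _ in range(200):       # iterate until the distribution stops changing
    dist = step(dist, P)
# The stationary distribution solves pi = pi P; here it is (5/6, 1/6)
```

The same fixed-point idea, with randomness added, underlies the Markov chain Monte Carlo algorithms taught in the labs.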

Algorithms

(CMSC 37000). Yury Makarychev, T/Th 9:30-10:50 a.m.

This is a graduate-level course on algorithms, with emphasis on computational problems that are central to both theory and practice and on developing techniques for the design and rigorous analysis of algorithms and data structures for such problems.

Network Analysis

(PLSC 57200). John Padgett, Mondays 1:30-4:20 p.m.

This seminar explores the sociological utility of the network as a unit of analysis. How do the patterns of social ties in which individuals are embedded differentially affect their ability to cope with crises, their decisions to move or change jobs, their eagerness to adopt new attitudes and behaviors? The seminar group will consider (a) how the network differs from other units of analysis, (b) structural properties of networks, (c) consequences of flows (or content) in network ties, and (d) dynamics of those ties.

Game Theory II

(PLSC 31000). John Patty, T/Th 3:30-4:50 p.m.

This is a course for graduate students in Political Science. It introduces students to games of incomplete information through solving problem sets. We will cover the concepts of Bayes Nash equilibrium, perfect Bayesian equilibrium, and quantal response equilibrium. In terms of applications, the course will extend the topics examined in the prerequisite, PLSC 30901 (Game Theory I), to allow for incomplete information, with a focus on the competing challenges of moral hazard and adverse selection in those settings.

Social Choice Theory

(PLSC 40801). Maggie Penn, T/Th 2-3:20 p.m.

This course will provide you with an introduction to the field of social choice theory, the study of aggregating the preferences of individuals into a "collective preference." It will focus primarily on classic theorems and proof techniques, with the aim of examining the properties of different collective choice procedures and characterizing procedures that yield desirable outcomes. The classic social choice results speak not only to the difficulties in aggregating the preferences of individuals, but also to the difficulties in aggregating any set of diverse criteria that we deem important to making a choice or generating a ranking. Specific topics we will cover include preference aggregation, rationalizable choice, tournaments, sophisticated voting, domain restrictions, and the implicit trade-offs made by game theoretic versus social choice theoretic approaches to modeling.
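As one concrete instance of preference aggregation, the sketch below implements the Borda count, which scores each alternative by its rank on every ballot; the preference profile is hypothetical.

```python
# The Borda count: a classic preference aggregation rule.

# Each ballot ranks the alternatives best-first
ballots = [
    ["a", "b", "c"],
    ["a", "c", "b"],
    ["b", "c", "a"],
]

def borda(ballots):
    """With m alternatives, rank k (0-indexed) earns m - 1 - k points."""
    m = len(ballots[0])
    scores = {}
    for ballot in ballots:
        for k, alt in enumerate(ballot):
            scores[alt] = scores.get(alt, 0) + (m - 1 - k)
    return scores

scores = borda(ballots)
winner = max(scores, key=scores.get)
```

Rules like this are exactly the collective choice procedures whose properties, and whose unavoidable trade-offs, the classic theorems in the course characterize.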