VIDEO: Pedagogical Workshop on "Web Scraping"
Sociology graduate student Forrest Gregg explains techniques for gathering data sets from web sites.
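The workshop's subject can be illustrated with a minimal sketch of one common scraping approach: fetch a page's HTML over HTTP (e.g. with `urllib`) and extract structured data from it with a parser. The example below inlines the HTML so it is self-contained, and uses only Python's standard-library `html.parser`; the sample markup, school names, and numbers are invented for illustration and are not from the workshop.

```python
# A minimal web-scraping sketch: parse HTML and collect the text of
# every <td> cell, grouped by table row. In real use the HTML string
# would come from an HTTP request; here it is inlined for illustration.
from html.parser import HTMLParser

SAMPLE_HTML = """
<table id="schools">
  <tr><td>Hyde Park Academy</td><td>1200</td></tr>
  <tr><td>Kenwood Academy</td><td>1800</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Accumulate the text of each <td> cell, one list per <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows = []          # completed rows
        self._row = None        # cells of the row currently being parsed
        self._in_cell = False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "tr" and self._row is not None:
            self.rows.append(self._row)
            self._row = None
        elif tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell:
            self._row.append(data.strip())

scraper = TableScraper()
scraper.feed(SAMPLE_HTML)
print(scraper.rows)
# [['Hyde Park Academy', '1200'], ['Kenwood Academy', '1800']]
```

Real projects typically swap the inlined string for a fetched page and add politeness measures (rate limiting, respecting robots.txt), but the parse-and-collect pattern is the same.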
Social scientists increasingly have access to data sets of unparalleled scope and complexity. New technologies have made it possible for governments, private companies, and innovative researchers to collect such data, and advances in computer science and statistics have enabled inferential, simulated, and visual analyses that are only now being incorporated into faculty research.
For those new to the social sciences, this is an opportunity to see where your computer science and statistical skills can take you, with innovative applications to problems of major societal importance.
For those new to computational methods, this is a chance to develop the tools necessary to make new and exciting contributions, tools that will shape the originality and power of your work for years to come.
With access to the full resources and faculty of the University of Chicago, in a small cohort that is faculty-mentored and assisted by prize-winning doctoral “preceptors,” you will be trained as a colleague and contributor for the next great wave of social science research.
Richard Evans, Senior Lecturer in Computational Social Science and Fellow at the Becker Friedman Institute, discusses the immense potential of the methods, practices, and workflows that computer engineers have developed in their own discipline. He is working to bring those skills into Chicago economics through his roles at the institute and in the Master's in Computational Social Science program.
Evans spoke to BFI about how we can expect computation to shape different aspects of economic study, as well as the ways that computer scientists and software engineers can teach economists to work smarter. Listen to the podcast on SoundCloud.