By Rob Mitchum // December 21, 2012
Last week, we announced the newest CI research center, the Urban Center for Computation and Data (UrbanCCD). Led by CI senior fellow Charlie Catlett, the center will bring the latest computational methods to bear on the question of how to intelligently design and manage large and rapidly growing cities around the world. With more cities, including our home of Chicago, releasing open datasets, the UrbanCCD hopes to bring advanced analytics to these new data sources and use them to construct complex models that can simulate the effects of new policies and interventions upon a city’s residents, services and environment.
Since the announcement, news outlets including Crain’s Chicago Business, RedEye and Patch have written articles about UrbanCCD and its mission. The center was also highlighted by UrbanCCD collaborators at the School of the Art Institute of Chicago and Argonne National Laboratory, and endorsed by US Rep. Daniel Lipinski.
For more examples of what cities are doing with open data releases and the applications built on those datasets, see The Best Open Data Releases of 2012 as decided by Atlantic Cities or WBEZ’s breakdown of the potential for Chicago’s new open data policy.
OTHER NEWS IN COMPUTATIONAL SCIENCE
Argonne posted this great video featuring the work of CI fellows Salman Habib and Katrin Heitmann using supercomputers to understand the role of dark energy and dark matter in the universe. The project is simulating the universe from its formation to the modern day, a massive effort which will require the use of Mira, one of the world’s fastest supercomputers. The team’s research was named a finalist for the 2012 Gordon Bell Prize, an award recognizing outstanding achievement in high-performance computing.
If you want to look back at the early days of computing, the Internet Archive has scans of Creative Computing magazine, which was published from October 1974 until December 1985. The advertisements alone are a humorous reminder of how far the field has come in the nearly forty years since — one 1980 ad trumpets a large, boxy external hard drive that provides up to 591K bytes of data storage.
How do you write 800,000 books and put them up for sale on Amazon? This programmer wrote an algorithm that can create a new book in 20 minutes, on topics ranging from Webster’s Slovak – English Thesaurus Dictionary to The World Market for Rubber Sheath Contraceptives (Condoms): A 2007 Global Trade Perspective, a patented system described by SingularityHub and ExtremeTech.
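The general technique behind this kind of automated publishing is template-driven text generation: boilerplate prose is filled in from a structured dataset, so producing a "new" book is mostly a database query. A minimal sketch of that idea is below — the templates, function names, and figures are all illustrative assumptions, not details of the patented system:

```python
# Hypothetical sketch of template-driven book generation, the general
# technique behind automated publishing. All names and data here are
# illustrative, not taken from the actual patented system.

TITLE_TEMPLATE = "The World Market for {product}: A {year} Global Trade Perspective"

SECTION_TEMPLATE = (
    "In {year}, {country} imported {product} worth an estimated "
    "${value:,} (figures illustrative)."
)

def generate_book(product, year, trade_data):
    """Assemble a 'book' by filling prose templates from structured data."""
    title = TITLE_TEMPLATE.format(product=product, year=year)
    sections = [
        SECTION_TEMPLATE.format(year=year, country=c, product=product, value=v)
        for c, v in sorted(trade_data.items())
    ]
    return title + "\n\n" + "\n".join(sections)

# Toy dataset standing in for a real trade database.
data = {"Chile": 1_200_000, "Poland": 850_000}
print(generate_book("rubber sheath contraceptives", 2007, data))
```

Given a database with thousands of products and years, looping this over every row yields thousands of distinct titles at essentially zero marginal cost — which is how a single programmer can plausibly list 800,000 books.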
‘Tis the season for year-end lists, and HPCwire contributes to this December tradition with a list of 2012 hits and misses in the field of high-performance computing.
Sasha Issenberg and MIT Technology Review put together a lengthy, three-part examination of how the Obama campaign used data analytics to help win the 2012 election, and how big data methods can paradoxically make future politics more local.