A machine-learning census of America’s cities
“WOULD it not be of great satisfaction to the king to know, at a designated moment every year, the number of his subjects?” A military engineer by the name of Sébastien le Prestre de Vauban posed this question to Louis XIV in 1686, pitching him the idea of a census. All France’s resources, the wealth and poverty of its towns and the disposition of its nobles would be counted, so that the king could control them better.
These days, such surveys are common. But they involve a lot of shoe-leather, and that makes them expensive. America, for instance, spends hundreds of millions of dollars every year on a socioeconomic investigation called the American Community Survey; the results can take half a decade to become available. Now, though, a team of researchers, led by Timnit Gebru of Stanford University in California, has come up with a cheaper, quicker method. Using powerful computers, machine-learning algorithms and mountains of data collected by Google, the team carried out a crude, probabilistic census of America’s cities in just two weeks.
First, the researchers trained their machine-learning model to recognise the make, model…
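The core idea, as far as the excerpt describes it, is to classify the cars visible in street imagery and then aggregate those classifications by neighbourhood as a demographic proxy. The sketch below is not the authors’ pipeline; it is a minimal toy illustration of that pattern, using entirely synthetic feature vectors in place of real image features, hypothetical make labels, and an off-the-shelf scikit-learn classifier:

```python
# Toy illustration (not the study's actual method): train a classifier
# on synthetic "car crop" feature vectors, then tally predicted makes
# for one hypothetical city block. All data here is fabricated.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Each row stands in for features extracted from one car image;
# labels are hypothetical make IDs (0, 1, 2).
n_makes, n_samples, n_features = 3, 600, 20
centers = rng.normal(size=(n_makes, n_features))
labels = rng.integers(0, n_makes, size=n_samples)
features = centers[labels] + 0.5 * rng.normal(size=(n_samples, n_features))

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"held-out accuracy: {acc:.2f}")

# Aggregate predicted makes over the unlabeled crops from one "block";
# in the study, such counts per area feed the demographic estimates.
block = centers[[0, 0, 1, 2, 0]] + 0.5 * rng.normal(size=(5, n_features))
counts = np.bincount(clf.predict(block), minlength=n_makes)
print("make counts for the block:", counts.tolist())
```

The real system classified fine-grained make, model and year from millions of Street View images with deep neural networks; the point of the sketch is only the classify-then-aggregate structure.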
Source: Economist