This was such an obviously bullshit passage to anyone who knows anything about public data. The full excerpt from the book:
The syndicate, Brick explains, is using some of the same algorithmic and analytic technologies popularized by Google, Amazon, and Facebook, its computers analyzing thousands of data points every hour — information drawn from federal census records, online court documents, news stories, public health statistics, and government databases. Among their many uses, these algorithms track policing patterns in hundreds of U.S. neighborhoods, then steer BGF dealers clear of streets where cops are concentrated. They also identify what Brick calls “hot markets” — places where young, low-income African Americans are clustered — as destructive a tool that’s ever existed in the urban drug game, authorities say.
No, that data does not exist across hundreds (or even dozens) of jurisdictions in a form that would be easily collectable by an organization whose main purpose is running a drug-delivery service while fighting a gang war. Nor are those data sources real-time enough to be actionable. Look how difficult and resource-intensive it was for the Washington Post to retroactively track officer-involved fatal shootings — it’s not a problem that even the underground’s best hackers can overcome through code alone.
But the most obviously bullshit-smelly part is in the first sentence:
The syndicate, Brick explains, is using some of the same algorithmic and analytic technologies popularized by Google, Amazon, and Facebook, its computers analyzing thousands of data points every hour
So much data! That amount of data processing power might sound remotely impressive to a newspaper reporter from the 1940s, or 1840s. But it’s the equivalent of Elon Musk bragging that his new car travels “Several thousands of centimeters in just a single hour!”
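For a sense of scale, here’s a throwaway script (a toy benchmark I wrote for illustration; the “data points” are just random numbers, not anything from the book) that crunches 100,000 data points — two orders of magnitude more than the syndicate’s supposed hourly volume — and times how long it takes. On any commodity laptop this finishes in a small fraction of a second:

```python
import random
import time

# Synthetic dataset: 100,000 "data points" -- far more than the
# "thousands ... every hour" the book presents as impressive.
points = [random.random() for _ in range(100_000)]

start = time.perf_counter()
# A pass of basic analysis: sum, mean, and a filtered count.
total = sum(points)
mean = total / len(points)
above_mean = sum(1 for p in points if p > mean)
elapsed = time.perf_counter() - start

print(f"Processed {len(points):,} points in {elapsed:.4f} seconds")
```

That’s interpreted Python with no optimization at all, and it still handles in milliseconds what the passage frames as Big Tech–grade computation running for an hour.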
Deutsch claims he depended on his sources to describe their own technical feats, so we can assume that they would, if anything, oversell the amount of data-crunching they do.