By Jonathan Harris
On a small scale, the effects of software are benign. But at large companies, institutions, and agencies with hundreds of millions of users, something as seemingly small as the choice of a default setting has an immediate impact on the daily behavior patterns of a large percentage of the planet’s population.
On the web, where people have learned not to value things directly, the most common business model is to make a product, give it away for free, attain tremendous scale, and then, once you have a lot of users, turn those users into the product by selling their attention to advertisers and their personal information to marketing departments.
This is a dangerous deal—not necessarily in economic terms, but in human terms—because, once the user has become the product, the user is no longer treated as an individual but as a commodity, and not even a precious commodity but one insignificant data point among many—a rounding error, meaningful only in aggregate. Thinking of humans this way produces sociopathic behavior: rational in economic terms, but very bad in human terms.
The bad medicine is strong and corrupting, for it can make companies and the people who run them incredibly rich, but we have to resist the bad medicine, because humanity doesn’t need any more addictions. Big Data is powerful, but it is ethically neutral; we have to choose how to use it.
This essay is taken from the book The Human Face of Big Data.
Jonathan Harris makes projects that reimagine how humans relate to technology and to each other. Combining elements of computer science, anthropology, visual art and storytelling, his projects range from building the world’s largest time capsule (with Yahoo!) to documenting an Alaskan Eskimo whale hunt on the Arctic Ocean (with a warm hat).