"The Search Engine Is The Most Powerful Source Of Mind Control Ever Invented..."

Authored by Asher Schechter via ProMarket.org,

The opening panel of the Stigler Center’s annual antitrust conference discussed the source of digital platforms’ power and what, if anything, can be done to address the numerous challenges presented by their ability to shape opinions and outcomes.

Google CEO Sundar Pichai caused a worldwide sensation earlier this week when he unveiled Duplex, an AI-driven digital assistant able to mimic human speech patterns (complete with vocal tics) to such a convincing degree that it managed to have real conversations with ordinary people without them realizing they were actually talking to a robot.

While Google presented Duplex as an exciting technological breakthrough, others saw something else: a system able to deceive people into believing they were talking to a human being, an ethical red flag (and a surefire way to get to robocall hell). Following the backlash, Google announced on Thursday that the new service will be designed “with disclosure built-in.” Nevertheless, the episode created the impression that ethical concerns were an “after-the-fact consideration” for Google, despite the fierce public scrutiny it and other tech giants faced over the past two months. “Silicon Valley is ethically lost, rudderless and has not learned a thing,” tweeted Zeynep Tufekci, a professor at the University of North Carolina at Chapel Hill and a prominent critic of tech firms.

The controversial demonstration was not the only sign that the global outrage has yet to inspire the profound rethinking critics hoped it would bring to Silicon Valley firms. In Pichai’s speech at Google’s annual I/O developer conference, the ethical concerns regarding the company’s data mining, business model, and political influence were briefly addressed with a general, laconic statement: “The path ahead needs to be navigated carefully and deliberately and we feel a deep sense of responsibility to get this right.”

A joke regarding the flawed design of Google’s beer and burger emojis received roughly the same amount of time.

Google’s fellow FAANGs also seem eager to put the “techlash” of the past two years behind them. Facebook, its shares now fully recovered from the Cambridge Analytica scandal, is already charging full-steam ahead into new areas like dating and blockchain.

But the techlash likely isn’t going away soon. The rise of digital platforms has had profound political, economic, and social effects, many of which are only now becoming apparent, and their sheer size and power make it virtually impossible to exist on the Internet without using their services. As Stratechery’s Ben Thompson noted in the opening panel of the Stigler Center’s annual antitrust conference last month, Google and Facebook—already dominating search and social media and enjoying a duopoly in digital advertising—own many of the world’s top mobile apps. Amazon has more than 100 million Prime members, for whom it is usually the first and last stop for shopping online.

Many of the mechanisms that allowed for this growth are opaque and rooted in manipulation. What are those mechanisms, and how should policymakers and antitrust enforcers address them? These questions, and others, were the focus of the Stigler Center panel, which was moderated by the Economist’s New York bureau chief, Patrick Foulis.

The Race to the Bottom of the Brainstem

“The way to win in Silicon Valley now is by figuring out how to capture human attention. How do you manipulate people’s deepest psychological instincts, so you can get them to come back?” said Tristan Harris, a former design ethicist at Google who has since become one of Silicon Valley’s most influential critics. Harris, who co-founded the Center for Humane Technology, an organization seeking to change the culture of the tech industry, described the tech industry as an “arms race for basically who’s good at getting attention and who’s better in the race to the bottom of the brainstem to hijack the human animal.”

The proliferation of AI, Harris said, creates an asymmetric relationship between platforms and users. “When someone uses a screen, they don’t really realize they’re walking into an environment where there’s 1,000 engineers on the other side of the screen who asymmetrically know way more about their mind [and] their psychology, have 10 years [of data] about what’s ever gotten them to click, and use AI prediction engines to play chess against that person’s mind. The reason you land on YouTube and wake up two hours later asking ‘What the hell just happened?’ is that Alphabet and Google are basically deploying the best supercomputers in the world—not at climate change, not at solving cancer, but at basically hijacking human animals and getting them to stay on screens.”

This asymmetric relationship, one that Harris argues ought to be treated as fiduciary, is best exemplified by Facebook, which is akin to a “psychotherapist who knows every single detail in your life, including the details of your inner life, in the sense that it doesn’t just know who you click on at two in the morning and what you post and your TINs and your photos and your family and who you talk to the most and who your friends are. It also intermediates every single one of your communications. It knows what colors your brain lights up to if I give you a red button or a green button or a yellow button. It knows which words activate your psychology. It knows an unprecedented amount of information about what will manipulate you. If there’s ever been a precedent or a need for defining something as being an asymmetric or fiduciary relationship, it’s this one.”

Facebook’s ad-based business model, Harris argued, is “obviously misaligned” with its asymmetric power. “Would you want to be paying that psychotherapist or would you want that psychotherapist to instantly take all that personal information about you, the most intimate details of your life, and then sell it to car salesmen?”

“The reason you land on YouTube and wake up two hours later [asking] ‘What the hell just happened?’ is that Alphabet and Google are basically deploying the best supercomputers in the world—not at climate change, not at solving cancer, but at basically hijacking human animals and getting them to stay on screens.”

It’s not that Silicon Valley lacks goodwill, he said. In 2013 Harris, then a product manager at Google, prepared a presentation arguing that Google, while having the power to shape elections and societies, often exploits users’ psychological vulnerabilities instead of acting with their best interests in mind. The presentation went viral and got Harris promoted to the role of “design ethicist.”

Ultimately, though, the company quickly reverted to business as usual. The problem, said Harris, was the incentive to maximize users’ time and attention. “If you’re at YouTube, you’re incentivized to get people to spend time on videos, even if those videos are conspiracy theories. The product manager—25 years old, going to stay at YouTube for two years, went to a good school—their job is just to show on their resume that they made the engagement numbers on videos go up. Then you wake up two years later and YouTube has driven 15 billion views to Alex Jones’ videos. That’s not videos people sought out themselves. That’s actually YouTube driving the recommendation.”
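The incentive Harris describes can be made concrete with a minimal sketch (ours, not YouTube's actual system): a recommender whose only objective is predicted watch time will surface whatever retains attention, with no term for accuracy or quality. All names and numbers below are invented for illustration.

```python
# Illustrative sketch of an engagement-only recommender (not YouTube's actual
# system): candidates are ranked purely by predicted watch time, so whatever
# retains attention, conspiratorial or not, rises to the top.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # output of some engagement model

def recommend(candidates, k=3):
    """Return the k videos the engagement model scores highest."""
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes,
                  reverse=True)[:k]

candidates = [
    Video("Cooking tutorial", 4.2),
    Video("Conspiracy deep-dive, part 7", 11.8),  # outrage holds attention
    Video("Local news clip", 2.1),
    Video("Reaction compilation", 8.5),
]

# Nothing in the objective penalizes the conspiracy video, so it ranks first.
top = recommend(candidates)
```

The point of the sketch is that no individual engineer needs to intend the outcome: optimizing the single metric is enough.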

“The Search Engine Is the Most Powerful Source of Mind Control Ever Invented”

Robert Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology in California and the former editor of Psychology Today, is one of only a few scholars who have conducted empirical studies on the ability of digital platforms to manipulate opinions and outcomes. In a 2015 study, Epstein and Ronald E. Robertson reported the discovery of what they consider “one of the largest behavioral effects ever identified”: the search engine manipulation effect (SEME). Simply by placing search results in a particular order, they found, voters’ preferences can shift dramatically, “up to 80 percent in some demographic groups.”
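A toy model, our illustration rather than Epstein and Robertson's methodology, shows why ordering alone is so powerful: click-through rate falls off steeply with rank, so the same ten pages, reordered, give a candidate very different exposure. The CTR values are assumed for illustration, not measured.

```python
# Toy model of the search engine manipulation effect (SEME): the same set of
# results, reordered, yields very different exposure because click-through
# rate decays sharply with rank. CTR-by-position values are illustrative.
POSITION_CTR = [0.30, 0.15, 0.10, 0.07, 0.05, 0.04, 0.03, 0.02, 0.02, 0.01]

def exposure(results, candidate):
    """Share of click mass landing on pages favorable to `candidate`."""
    return sum(ctr for ctr, page in zip(POSITION_CTR, results)
               if page == candidate)

pro_a = ["A"] * 5 + ["B"] * 5   # pro-A pages ranked on top
pro_b = ["B"] * 5 + ["A"] * 5   # identical pages, order flipped

# Same ten pages; ordering alone decides who dominates the clicks.
a_share_when_ranked_high = exposure(pro_a, "A")   # 0.67 of click mass
a_share_when_ranked_low = exposure(pro_b, "A")    # 0.12 of click mass
```

Under these assumed CTRs, flipping the order moves candidate A from roughly five-sixths of the click mass to about one-sixth, without changing a single page of content.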

“What I stumbled upon in 2012 or early 2013 quite by accident was a particular mechanism that shows how you can shift opinions and votes once you’ve got people hooked to the screen,” said Epstein.

While much of the political and public scrutiny of digital platforms has been focused on the behavior of bad actors like Cambridge Analytica, Epstein called these scandals a distraction, saying, “Don’t worry about Cambridge Analytica. That’s just a content provider.” Instead, he said, the power of digital platforms to manipulate users lies in the filtering and ordering of information: “It’s no longer the content that matters. It’s just the filtering and ordering.” Those functions, he noted, are largely dominated by two companies: Google and, to a lesser extent, Facebook.

SEME, said Epstein, is but one of five psychological effects of using search engines he and his colleagues are studying, all of which are completely invisible to users. “These are some of the largest effects ever discovered in the behavioral sciences,” he claimed, “but since they use ephemeral stimuli, they leave no trace. In other words, they leave no trace for authorities to track.”

Another effect Epstein discussed is the search suggestion effect (SSE). Google, Epstein’s most recent paper argues, has the power to manipulate opinions from the very first character that people type into the search bar. Google, he claimed, is also “exercising that power.”

“We have determined, through our research, that the search suggestion effect can turn a 50/50 split among undecided [voters] into a 90/10 split just by manipulating search suggestions.”

One simple way to do this, he said, is to suppress negative suggestions. In 2016, Epstein and his coauthors noticed a peculiar pattern when typing the words “Hillary Clinton is” into Google, Yahoo, and Bing. On Yahoo and Bing, the autocomplete suggested searches like “Hillary Clinton is evil,” “Hillary Clinton is a liar,” and “Hillary Clinton is dying of cancer.” Google, however, suggested far more flattering phrases, such as “Hillary Clinton is winning.” Google has argued that the differences can be explained by its policy of removing offensive and hateful suggestions, but Epstein argues that this is but one example of the massive opinion-shifting capabilities of digital platforms. Google, he argues, has likely been determining the outcomes of a quarter of the world’s elections in recent years through these tools.
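One way such suppression could be implemented, purely as a hypothetical sketch (the blocklist and function are ours, not Google's actual policy), is to filter the raw suggestion list against a set of negative terms before display:

```python
# Hypothetical sketch of negative-suggestion suppression: the raw suggestion
# list is filtered against a blocklist of negative words before display.
# The blocklist and example phrases are illustrative, not any real policy.
NEGATIVE_TERMS = {"evil", "liar", "dying", "crooked"}

def filter_suggestions(raw):
    """Keep only suggestions containing no blocklisted word."""
    return [s for s in raw if NEGATIVE_TERMS.isdisjoint(s.lower().split())]

raw = [
    "hillary clinton is winning",
    "hillary clinton is evil",
    "hillary clinton is a liar",
    "hillary clinton is dying of cancer",
]

shown = filter_suggestions(raw)  # only the flattering phrase survives
```

Whether such a filter reads as "removing hateful content" or as "suppressing negatives" depends entirely on what goes into the blocklist, which is exactly the asymmetry Epstein is pointing at.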

“The search engine is the most powerful source of mind control ever invented in the history of humanity,” he said. “The fact that it’s mainly controlled by one company in almost every country in the world, except Russia and China, just astonishes me.”

Epstein declined to speculate whether these biases are the result of deliberate manipulation on the part of platform companies. “They could just be from neglect,” he said. However, he noted, “if you buy into this notion, which Google sells through its PR people, that a lot of these funny things that happen are organic, [that] it’s all driven by users, that’s complete and utter nonsense. I’ve been a programmer since I was 13 and I can tell you, you could build an algorithm that sends people to Alex Jones’s videos or away from Alex Jones’s videos. You can easily alter whatever your algorithm is doing to send people anywhere you want to send them. The bottom line is, there’s nothing really organic. Google has complete control over what they put in front of people’s eyes.”

A “Nielsen-type” network of global monitoring, suggested Epstein, might provide a partial solution. Together with “prominent business people and academics on three continents,” he said, he has been working on developing such a system that would track the “ephemeral stimuli” used by digital platforms. By using such a system, he said, “we will make these companies accountable to the public. We will be able to report irregularities to authorities, to law enforcement, to regulators, antitrust investigators, as these various manipulations are occurring. We think long-term that is the solution to the problems we’re facing with these big tech companies.”
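The monitoring idea can be sketched as follows; this is our illustration of the concept, not Epstein's actual system, and all names are invented. Field clients submit snapshots of the ephemeral stimuli a platform showed them, and an auditor flags queries where different clients saw different suggestion sets:

```python
# Sketch of a "Nielsen-type" monitor for ephemeral stimuli (an illustration
# of the idea, not Epstein's actual system): clients report what a platform
# showed them, and the auditor flags queries whose snapshots diverge.
from collections import defaultdict

class StimulusMonitor:
    def __init__(self):
        # (engine, query) -> list of (client_id, suggestions) snapshots
        self.snapshots = defaultdict(list)

    def record(self, engine, query, client_id, suggestions):
        self.snapshots[(engine, query)].append((client_id, tuple(suggestions)))

    def irregularities(self):
        """Queries where different clients saw differing suggestion sets."""
        return [key for key, snaps in self.snapshots.items()
                if len({s for _, s in snaps}) > 1]

m = StimulusMonitor()
m.record("engine_x", "candidate q", "client_1", ["q is winning"])
m.record("engine_x", "candidate q", "client_2", ["q is winning"])
m.record("engine_x", "candidate q", "client_3", ["q is a liar", "q is winning"])

flagged = m.irregularities()  # divergence across clients gets reported
```

Because the stimuli "leave no trace" on the platform side, the only durable record is the one the monitoring network itself keeps; that archival role is the whole point of the design.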

Is Antitrust the Solution?

In the past two years, a growing movement of scholars, policy wonks and politicians has argued that many of the challenges associated with digital platforms are related to market concentration and has favored increased antitrust enforcement, possibly even breaking platforms up, as a way to address their growing power. But is antitrust the best way to address things like addiction-enhancing business models? Kevin Murphy, a Chicago Booth economics professor, doesn’t think so.

“First off, most of what we’re talking about has nothing to do with concentration. These problems would exist absent concentration. Secondly, a focus on concentration as the bottom line of antitrust is misguided as well. The idea that we have some new world that doesn’t look like things we’ve seen before, I don’t know, I don’t see it,” said Murphy.

“Would this be an easier problem to solve if we had 100 firms out there all trying to influence people using these same methods? It might be a much more difficult regulatory process in that world. It might be difficult to measure, difficult to regulate. It’s not clear how this is related to the concentration issue per se,” he added.

Similar arguments about the ability of technology to manipulate elections, Murphy opined, were also made in the early days of television, and concerns over market power were heard during the early days of the Internet. “I remember the days where Yahoo was thought to have an insurmountable first-mover advantage in search, or [when] AOL had an insurmountable first-mover advantage in access to people’s eyeballs, or [Windows] Media Player was going to dominate digital music. The idea that we’re any good at predicting how these markets are going to move or any good at shaping how they’re going to move seems to me to be odd. It also seems like a poor use of antitrust.”

There is also the question of just what sort of impact digital platforms have on the economy as a whole. Contrary to their prominence in the political debate, noted Chad Syverson, also an economics professor at Chicago Booth, the rise of digital platforms coincided with a decade of historically low productivity growth. Tech firms often like to portray themselves as bucking this trend, but evidence of this has so far been slim. It is possible, said Syverson, “the brain space dedicated to these companies, right now at least, exceeds the economic space that they fill up. The entire information sector, which is all of telecom, all of broadcasting, publishing, online and off, and some other sectors, that’s less than five percent of GDP.”

For years, tech execs have fended off possible antitrust actions by claiming that their dominance is not a competition issue, utilizing Alphabet CEO and Google co-founder Larry Page’s argument that “competition is only one click away.” The problem that faces antitrust enforcers, argued Thompson, is that it’s “kind of true.”

“You can go to Bing. You can go to DuckDuckGo, which doesn’t track your information. You can go to other e-commerce sites. You can go to other social networks,” said Thompson. “The issue is that customers don’t want to. It’s not that they can’t. It’s that no one wants to go anywhere else.” The services that platforms offer are vastly superior to what came before them, and network effects mean they can offer an overall better user experience than any fledgling competitor. According to Thompson, this is the paradox antitrust enforcers have to contend with: “The bigger you are, the better you are, at least from a consumer perspective.”

Concentration Really Does Matter

Responding to Murphy, Yale University economics professor Fiona Scott Morton argued that while there have been similar concerns over technology’s ability to influence and manipulate in the past, the difference is the precision with which digital platforms can target users at the individual level.

Concentration, she said, is a relevant issue because of the massive influence currently held by a small number of actors. “If there were 30 search engines and everybody was evenly distributed across those 30 search engines and each one had a bias, we would not think that any one of them was perhaps tipping an election. That’s the sense in which the concentration really does matter to the problems that we’re talking about.”

Responding to Syverson, Scott Morton said that “it is a little misleading” to say platforms are only a small part of GDP. The influence of their technologies on the rest of the economy, she noted, exceeds their actual share of GDP.

While there are potential costs associated with regulating digital platforms, these are not necessarily larger than the benefits that would come from regulating them, Scott Morton asserted. “Of course, we’re going to make a mistake, but we balance the mistakes of regulation. My photos aren’t shared quite as well as they might have been. The search term doesn’t come up quite as fast as it otherwise would because we’ve regulated the company away from innovating in that space. Then there’s the cost of not regulating, which is our democracy doesn’t work anymore, and we have to balance those two things. As a society, we’re having a national conversation about how that latter thing is a lot bigger than we thought it was before.”

Many of these challenges, Scott Morton noted, are not strictly related to competition. When it comes to antitrust, she said, “there’s a little bit of a shortage of really tight theories of harm,” which is why, she said, antitrust cases against digital platforms have not moved forward. “There’s also a question of political will to bring those cases,” she acknowledged, but “even with political will, you have to have a really good explanation of how competition is being harmed.”

At the conclusion of the panel, Foulis asked the panelists whether platform companies will be more or less powerful in 10 years. The panelists were divided. “In 10 years, I think the surveillance business model will have been made illegal,” said Epstein, whereas Scott Morton argued that platforms will ultimately become more powerful. “I’m afraid that I believe that people with profit are really good at hanging onto their profit,” she said.

Thompson also believes platforms will be more powerful. However, he said, “I do think people like Tristan are the biggest threat to these companies. The reason is because their power accrues not from controlling railways or telephone wires. Their power accrues from people continually making affirmative choices to use their platforms. That’s what gives them monopsony power. The way I think ultimately that power will be undone is through the political process.”

Comments

Croesus Sun, 05/13/2018 - 15:31

It's ALL "Programming", designed to subliminally influence people's thinking.

It's about knowing everything there is to know about us individually, without knowing us.

It's about control. It's about power.

It's about reducing humanity to a pack of slaves, for the benefit of the owners.

It's a big club, and we ain't in it.

At the same time, we DO NOT have to comply.

Tallest Skil DownWithYogaPants Sun, 05/13/2018 - 15:46

Most people on the Internet probably come in contact with fewer than a dozen sites. Google, with its Gmail and YouTube, Facebook, perhaps a random community like Tumblr, a couple of image boards, the occasional visit to Amazon, maybe some news websites, and that’s about it. For the vast majority of the population, the Internet is a prepackaged, socially engineered spy grid. It fuels itself on your input and weaponizes the information against you and everyone else. Already the social engineers are dividing us entirely, confusing the tongue, and making it difficult to communicate effectively. On Google and YouTube, comments and videos are filtered such that you only come in contact with certain predetermined material derived by social algorithms. They make it nearly impossible to discover new random channels and points of view. When you click on a video and scroll down, you’re presented with preselected comments that jibe with the opinions you tend to agree with and made to jump through hoops of inconvenience to look at all the other discussions taking place.

    Since Google is so influential, this sort of strategy is largely finding its way into every facet of the corporate-controlled Internet. This means that when I click on a video, say of the puppet Obama fake crying about Sandy Hook, I will see comments that are critical of his phony bullshit and other comments mocking the counterfeit brainwashing media. Yet when a stereotypical phony “liberal” feminist clicks on the same video, she’ll be presented with comments that agree with her gun-grabbing ideology. In effect, we’re being self-imprisoned on these tiny Internet islands where we can’t reach out to one another. Google can control who and what we interact with and see, and so divide and conquer the mind of the population. It’s a good strategy to quell dissent; when I click on a controversial news video or article, I unwillingly come in contact with opinions that tend to support my own, and so I leave with the sense that there is a consensus on a particular world event like Sandy Hook. This engineering of a false consensus has the effect of pacifying the people, making them content in their beliefs. In being content, they became lazy and stop questioning the world and discussing reality with those around them.

    By forcing the ignorant to be separate from the wise, from the stupid, from the trolls, even, this system of division is impeding the social development of humanity at large. The typical person on the Internet is confined within their own little bubble of information–a literal reservation matrix. The vast majority of modern people only interact with the world around them through the lens of the Internet. Everything they know–and much of where their worldview comes from–is directly influenced through what they experience online. By allowing a cabal of government/corporate entities with advanced technologies at their disposal to regulate what an individual interacts with online, they can shape and guide the development of one’s mind.

    We are, quite literally, being domesticated through sophisticated weaponized psychology.

    Most of human history and its accumulated knowledge is already immersed on the Internet; within our lifetimes all of it will be in the cloud, soon enough the entire population will be hardwired into the Internet, in one way or another. It’s conceivable that our entire species’ recorded collective experience–all of our history and knowledge–can be manipulated and censored by predatory algorithms that can gradually and insidiously edit the data to keep the truths from us. The beast supercomputers can sift through the entire Internet and gradually edit out certain sensitive or undesirable information–even change audio files and manipulate videos. In recent years, everyone’s identity is being lassoed to the Internet, such that there is no longer anonymity and free exchange. Certain people can be effectively silenced. The Internet with which I come into contact might be an entirely different Internet than the one others see. By socially engineering groups and confining certain people within these restricted informational reservations, reality and social/cultural trends can be manufactured. It’s such a passive and insidious strategy. Just as a virus entering a cell coats itself with the host’s own membrane, masquerading as self to elude detection, this beast computer consciousness uses our own information and our own architecture to elude our defenses and gain entrance into our collective mind.

 

War is peace, freedom is slavery, ignorance is strength.

Sir Edge The First Rule Sun, 05/13/2018 - 16:10

@Tallest Skil and @TheFirstRule --- Good comments...

I use Startpage.com and DuckDuckGo... for searches... 

I create a throwaway email address from an owned personal domain (with all proxy Admin Address Info for inquiries) for websites that MUST HAVE MY EMAIL ADDRESS... and forward them to my main email address and then kill them when i get the slightest spam... rinse and repeat... 

I only use Googie when i need to reverse a telemarketer phone number to find out WTF called (stuff it googie)

I do not allow my browser to do any tracking... by not allowing first-party or third-party cookie tracking... and ALL cookies are purged when i close my browser... This creates a necessity to log on to a lot of sites at the beginning of the day but with an auto-fill password tool it is only one click to do this and i then (when this is required) know i have NOT been tracked from one day to the next... or concurrently.

I use HTTPS Everywhere browser add on to increase my encryption connection to some websites.

I use uBlock Origin browser add on as my AD blocker... i think it is better than others

I use a VPN where i can choose any of a number of multiple areas to cloak from for my IP address (that is not leaked by test) and my GEO location...

I have now found there is a throw away phone number service on the internet...  https://www.tossabledigits.com/features.php

I do not use (((Google))) or (((Facebook))) because they are owned by active Israel First Zionists... 

I assume the NSA knows everything that i am doing and gives every American citizen's data/info to Israeli/Mossad intelligence per their agreement to do so with Israel over ten years ago... PRISM

I do the best i can...  :o)

 

PS... If every 2 weeks you do ONE thing to increase your privacy by the end of six months you would be beyond what i listed above...

 

Sir Edge venturen Sun, 05/13/2018 - 17:15

 @venturen

I am between FireFox / Chrome / Opera... trying to decide which one to use... Chrome works VERY hard to track you in various ways including their little logon button at the top of their browser next to their minimize button on Chrome... the little man login that you have to purposely log OUT of when you install Chrome browser.

I am on a windows platform and I am not happy about the Intel Kernel vulnerabilities that have been reported as of late by the tech media in late '17 or early '18... Intel hardware, at that level i THINK, is machined by Israeli companies FOR Intel... not sure but that was the skinny i heard back when ??

I use Malwarebytes and Microsoft Security Essentials... for AV.

Not trying to hide from NSA... good luck with that... 

Just doing the best to protect myself from Hackers etc. from EVERYWHERE and from EVERYBODY else...

 

Venturen... You have any other suggestions ?

Sir Edge loveyajimbo Sun, 05/13/2018 - 20:22

@loveyajimbo

I know that Avast Pro AV has had good reviews in the past but not sure over the last two years... Microsoft put out for FREE a very good pro AV software Microsoft Security Essentials... Which they put a lot of work into, for windows platform, so MICROSOFT would get a lot LESS virus phone calls to their support line... So they built MSE to lower their support costs... I use that plus Malwarebytes and it seems to work fine...

But you can check out the latest reviews on Avast Pro to see if you want to go that way...

Good Luck...   :o)

GreatUncle Sir Edge Sun, 05/13/2018 - 16:52

+1 I also use two routers, the first DMZ onto the second seems to stop google being able to authenticate = blocked.

Yes to the email and I use a personal email so the emails never sit in a public email server = 0 data mining.

It must be caught by GCHQ / NSA on the internet backbone as it passes through, not mined by some ISP mail server.

Encryption will come soon enough because Amazon cloud servers are now being used to capture the traffic and only encryption will maintain your privacy.

Encroaching Darkness The First Rule Sun, 05/13/2018 - 18:18

Yep.

When they started, the search engines would show you ALL the hits. Pages and pages, often over a hundred pages of hits. You could dig as deep as you wanted, and find people who didn't agree with each other, with the national news, with the people you live with. You might even compare and contrast one against another, and possibly even figure out some truth.

Now Google gives you 13 pages or so; you have to enter another search term to get more / different. And with their new censorship powers, you won't see anything THEY DON'T WANT YOU TO. You won't find ideas DIFFERENT FROM THEIRS, and you won't be able to search out STORIES THAT CONTRADICT THE NATIONAL NEWS.

THIS is why you need to use DuckDuckGo, RedTube / Dtube, Gab.ai (instead of twittler), and so on. REFUSE their amateurish attempts at mind control, and vary your online shopping habits. 

Screw them before they screw you up. 

shining one bamawatson Mon, 05/14/2018 - 06:27

Believe it or not, Snopes is an excellent web site for research. Not for finding the truth of course, but for finding out other areas where we are being lied to. They are basically giving us a huge list of everything they have their evil tentacles manipulating. Not only that, but hints at where to start your research.

 

Use their own weapons against them.

scam_MERS Croesus Sun, 05/13/2018 - 15:52

I have never used FakeBook, never will. I use DDG and Startpage for internet searches, and only use Amazon for reviews (and even less now, since many are fake). I even talked the wife into closing her Odnoklassniki account (Russian equivalent of FB). I do 99% of my shopping locally, and we grow quite a few of our own veggies. We're doing our part to stay out of the Matrix, although we're always looking for more things to opt out of.

stefan-coast scam_MERS Sun, 05/13/2018 - 18:01

I like startpage also... whenever I type in a subject I am looking for, startpage does not "finish my sentence for me". I hate that sometimes... I type something in and the computer finishes my statement, although what they think is not what I am thinking, so I have to backspace and type in what I want... I hate being told "what to do"... I will figure it out on my own, thank you very much :-)

Brazen Heist Sun, 05/13/2018 - 15:34

The problem is, people are volunteering their own data and becoming complicit in their own undoing. Kinda like dumbocracy.

When the ruling class herds the sheep into demanding their chains, that's total subjugation of the worst kind because it kills the fighting spirit and pacifies man into a content plantation subject.

Brazen Heist Robot Traders Mom Sun, 05/13/2018 - 15:52

A collective of voyeurs and narcissists......sums up social media quite well.

I'm in the process of downsizing my digital footprint, even considering going back fully to my old "dumb phone" that I have around, now that the "smart" phone battery can barely last a day without me even using it that much.

I guess some of us do not wish to take a big part in this kind of society and are happily content living in our own worlds. And I'm definitely no Luddite, quite the opposite, I am very tech savvy. I'm just highly selective about my gear, and like to live efficiently, rather than plastering my life all over the web. If you're too busy achieving things in life, you don't really have time for voyeurism and narcissism.

ZH however, is one of the best hangouts on the web and worth the time.

Chupacabra-322 Sun, 05/13/2018 - 15:40

Predictive Programming

 

The theory proposes that public media (such as films, television, news casts, etc.) are deliberately seeded with subtle clues to future social, political, or technological changes. According to the theory, when the relevant change is later introduced into the world, the public has become used to the idea through exposure, and therefore passively accepts it rather than offering resistance or opposition. Predictive programming is therefore thought to be a means of propaganda or mass psychological conditioning that operates on a subliminal or implicit level.

 

– Wikipedia, Predictive Programming

Brazen Heist Chupacabra-322 Sun, 05/13/2018 - 15:43

I don't know what its called, maybe you could call it behavioural pre-crime screening, but some shops now run certain algorithms on their CCTV cameras to screen for pre-crime like shoplifting. If someone is standing too long in one spot or walking in certain ways, it will trigger a security alert.

It begins with little things like that, that fly under the radar for most people.
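A dwell-time trigger of the sort described can be sketched in a few lines; the radius, threshold, and the tracks below are illustrative assumptions, not any vendor's actual algorithm.

```python
# Minimal sketch of a dwell-time alert: positions of a tracked person are
# sampled from CCTV, and an alert fires if they stay within a small radius
# of some point for longer than a threshold. All numbers are illustrative.
import math

def dwell_alert(track, radius=1.0, threshold_s=60):
    """track: list of (t_seconds, x, y) samples. True if the subject stays
    within `radius` metres of some anchor point for at least `threshold_s`."""
    for i, (t0, x0, y0) in enumerate(track):
        for t1, x1, y1 in track[i:]:
            if math.hypot(x1 - x0, y1 - y0) > radius:
                break  # left the neighborhood of this anchor point
            if t1 - t0 >= threshold_s:
                return True
    return False

# One subject shuffles in place for 85 seconds; the other keeps walking.
loiter = [(t, 0.1 * (t % 2), 0.0) for t in range(0, 90, 5)]
walker = [(t, 0.5 * t, 0.0) for t in range(0, 90, 5)]
```

Only the stationary track trips the alert, which is the whole rule: not what you do, just how long you stand still.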

Brazen Heist messystateofaffairs Sun, 05/13/2018 - 15:45

Oh it's here to stay alright. And what do we know is going to happen? People are going to fuck things up with it, abuse it for personal gain over others, and weaponize it.

That's what happens when chimps in suits are in power.

Add to the list of ideas/technology that got hijacked and abused:

- Nuclear power

- Democracy

- Capitalism

- Communism

- Political Correctness

- Liberalism

- Nationalism

And so on.

The question is, how long before they screw up the Internet?

vulcanraven Sun, 05/13/2018 - 15:41

I've been saying this forever, watching arguments play out all over social media has basically come down to "my Google skills are better than yours"

It is rare that people actually possess true knowledge anymore; the search engine has completely eroded their desire to retain information.