COMMITTED to train men and women
to have minds for the Lord Jesus,
hearts for the truth, and
hands that are skilled to the task.

03.30.2022 – INFORMATION – Part Three | Lesson 8

2 Timothy 3:1-5
Matthew 24:3-6

In 2019, Douglas Rushkoff published the book titled “TEAM HUMAN”:
“We were naïve to think that digital technology would be intrinsically and inevitably more empowering than any medium that came before it. Yes, digital networks are more directionless and decentralized than their broadcast predecessors. They allow messages to flow from the bottom up, or the outside in. But, like all media, if they’re not consciously seized by the people seeking empowerment, they’ll be seized by someone or something else. WHOEVER CONTROLS MEDIA CONTROLS SOCIETY. Each new media revolution appears to offer people a new opportunity to wrest that control from an elite few and reestablish the social bonds that media has compromised. But, so far anyway, the people – the masses – have always remained one entire media revolution behind those who would dominate them.
For instance, ancient Egypt was organized under the presumption that the pharaoh could directly hear the words of the gods, as if he were a god himself. The masses, on the other hand, could not hear the gods at all; they could only believe. With the invention of text, we might have gotten a literate culture. But text was used merely to keep track of possessions and slaves. When writing was finally put in service of religion, only the priests could read the texts and understand the Hebrew or Greek in which they were composed. The masses could hear the Scripture being read aloud, thus gaining the capability of the prior era – to hear the words of God. But the priests won the elite capability of literacy. When the printing press emerged in the Renaissance, the people gained the ability to read, but only the king and his chosen allies had the power to produce texts. Likewise, radio and television were controlled by corporations or repressive states. People could only listen or watch. With computers came the potential to program. Thanks to online networks, the masses gained the ability to write and publish their own blogs and videos – but this capability, writing, was the one enjoyed by the elites in the prior revolution. Now the elites had moved up another level and were controlling the software through which all this happened.
Today, people are finally being encouraged to learn code, but programming is no longer the skill required to rule the media landscape. Developers can produce any app they want, but its operation and distribution are entirely dependent on access to the walled gardens, cloud servers, and closed devices under the absolute control of just three or four corporations. The apps themselves are merely camouflage for the real activity occurring on these networks: the hoarding of data about all of us by the companies that own the platforms. Just as with writing and printing, we believe we have been liberated by the new medium into a boundless frontier, even though our newfound abilities are entirely circumscribed by the same old controlling powers. At best, we are settling the wilderness for those who will later monopolize our new world. The problem with media revolutions is that we too easily lose sight of what it is that’s truly revolutionary. By focusing on the shiny new toys and ignoring the human empowerment potentiated by these new media – the political and social capabilities they are retrieving – we end up surrendering them to the powers that be. Then we and our new inventions become mere instruments for some other agenda.”
Carl Teichrib, “Game of Gods”
TRANS-WHAT?
“An old but accurate definition can be found in the 1883 edition of the Imperial Dictionary of the English Language: ‘Transhuman means beyond or more than human.’ A contemporary description might sound like this: ‘Transhumanism is humanity’s intentional evolution through science and technology.’ Lincoln Cannon, then president of the Mormon Transhumanist Association, gave this definition in 2013: ‘Transhumanism is the ethical use of technology to expand our abilities from the human to the post-human.’ Transhumanism is thus a changeover, a stepping-stone, but not the final stage; it is a transition to a post-human potential, moving beyond what we presently are. This is a future-oriented vision, one fueled by incredible scientific and technical advances and the possibilities they portend: greatly magnifying cognitive abilities, enhancing sensory input, genetic restructuring to permanently eliminate disease and weakness, finding ways to move our consciousness into a non-corruptible body, the extension of human life – to the point of immortality – and even RESURRECTING THE DEAD.
A vast array of technologies and theoretical applications act like the carrot before the horse: virtual and augmented reality, brain-computer interfacing and the anticipation of uploading one’s mind into an artificial carrier, cybernetics and chip implants, Artificial Intelligence, robotics and self-replicating machines, nanotechnology, genetic manipulation, chemical switches for mood control and sharpened awareness, and cryonics for those who can afford to invest in a projected REAWAKENING. Using these technologies and predicting their impact on individuals and civilization, the offering of perfectibility – of forging an optimal species with near-infinite capacity through the works of our hands – becomes more than just a tantalizing dream. IT BECOMES A FAITH. The other option is to remain as we are, reside in our limitations, struggle for a few decades, and die. This is unacceptable. Thus, science becomes salvific, with hope placed in speculations about what technology may bring.
Transhumanists, those who hold to this promise of techno-futures, look to the Singularity with anticipation – that hypothetical point in time when information and technology outpace humanity, forcing us to fully integrate into manageable matter. The Singularity will break our limitations of flesh and bone: man, machine, and INFORMATION will merge into a new creation. Post-humanity is the anticipated result – our evolution BEYOND MAN: UEBERMENSCH!”
Max More, founder of the Extropy Institute, said:
“We can use science and technology to understand the causes of aging, and we can learn to eliminate those causes. It’s not an unsolvable problem. There’s nothing special about the human life span. It’s just an accident, an evolutionary accident…. And why should we accept that? So really transhumanism is about taking control of our own human evolution, and deciding how long we want to live, how smart we want to be, how well modulated our emotions should be. It’s really about turning our choices over to us rather than natural selection.”

“TEAM HUMAN” (Douglas Rushkoff)
“Living in a digitally enforced attention economy means being subjected to a constant assault of automated manipulation. Persuasive technology, as it’s now called, is a design philosophy taught and developed at some of America’s leading universities and then implemented on platforms from e-commerce sites and social networks to smartphones and fitness wristbands. The goal is to generate “behavioral change” and “habit-formation”, most often without the user’s knowledge or consent. Behavioral design theory holds that people don’t change their behaviors because of shifts in their attitudes and opinions. On the contrary, people change their attitudes to match their behaviors. In this model, we are more like machines than thinking, autonomous beings. Or at least we can be made to work that way. That’s why persuasive technologies are not designed to influence us through logic or even emotional appeals. This isn’t advertising or sales, in the traditional sense, but more like wartime psyops, or the sort of psychological manipulation exercised in prisons, casinos, and shopping malls.
Just as the architects of those environments use particular colors, soundtracks, or lighting cycles to stimulate desired behavior, the designers of web platforms and phone apps use carefully tested animations and sound to provoke optimal emotional responses from users. Every component of a digital environment is tested for its ability to generate a particular reaction. Persuasive design also exploits our social conditioning. We evolved the need to be aware of anything important going on in our social circle. Not having the knowledge that a fellow group member is sick or angry could prove disastrous. In the compliance professional’s hands, this “FEAR OF MISSING OUT” provides direct access to our behavioral triggers. All we need are a few indications that people are talking about something to stimulate our curiosity and send us online and away from whatever we were really doing.
So designers put a red dot with a number on it over an app’s icon to make sure we know that something’s happening, comments are accumulating, or a topic is trending.
If you refuse to heed the call, you may be the last person to find out. On the other hand, designers want to keep us in a state of constant disorientation – always scrolling and paying attention to something, but never so much attention that we become engrossed and regain our bearings. So they use interruption to keep us constantly moving from one feed to another, checking email and then social media, videos, the news, and then a dating app. Each moment of transition is another opportunity to offer up another advertisement, steer the user toward something yet more manipulative, or EXTRACT DATA THAT CAN BE USED TO PROFILE AND PERSUADE MORE COMPLETELY.”
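Neither the handout nor Rushkoff’s text includes any code, but the interruption pattern just quoted can be pictured with a short, purely hypothetical sketch: every nudge from one feed to the next is treated as one more ad slot and one more data point logged to the profile. The feed names, counters, and loop below are invented for illustration; this is not any platform’s real system.

```python
# Invented sketch of the "interruption" pattern described above: each switch
# from one feed to another is logged as profile data and monetized with an ad.
import random

feeds = ["email", "social media", "videos", "news", "dating app"]
profile_log = []        # every transition leaves a trace for later profiling
ads_shown = 0

current = "email"
for _ in range(5):      # five interruptions in a row
    nxt = random.choice([f for f in feeds if f != current])
    profile_log.append((current, nxt))   # extract data at the moment of switching
    ads_shown += 1                       # and serve an ad in that same moment
    current = nxt

print("Transitions logged for the profile:", profile_log)
print("Ads served purely at switch points:", ads_shown)
```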
“Instead of designing technologies that promote autonomy and help us make informed decisions, the persuasion engineers in charge of the biggest digital companies are hard at work creating interfaces that thwart our cognition and push us into an impulsive state where thoughtful choices – or thought itself – are nearly impossible. We now know, beyond any doubt, that we are dumber when we are using smartphones and social media. We understand and retain less information, comprehend with less depth, and make decisions more impulsively than we do otherwise. This untethered mental state, in turn, makes us less capable of distinguishing the real from the fake, the compassionate from the cruel, and even the human from the nonhuman. Our real enemies, if we can call them that, are not just the people who are trying to program us into submission, but the algorithms they’ve unleashed to help them do it.
Algorithms don’t engage with us humans directly. They engage with the DATA we leave in our wake to make assumptions about who we are and how we will behave. Then they push us to behave more consistently with what they have determined to be our statistically most probable selves. They want us to be true to our profiles. Everything we do in our highly connected reality is translated into DATA and STORED FOR COMPARISON AND ANALYSIS. This includes not only which websites we visit, purchases we make, and photos we click on, but also real-world behaviors such as our driving styles and physical movements as tracked by mapping apps and GPS. Our smart thermostats and refrigerators all FEED DATA into our profiles. Most people worry about what specific information companies may record about us; we don’t want anyone to know the content of our emails, what we look at for kicks, or what sorts of drugs we take. That’s the province of crude web retailers who follow us with ads for things we’ve already bought.
Algorithms don’t care about any of that. The way they make their assessments of who we are and how to manipulate us has more to do with all the meaningless metadata they collect, compile, and compare. For instance, Joe may travel twelve miles to work, look at his text messages approximately every sixteen minutes, purchase fat-free cookies, and watch a particular TV program two days after it airs. The algorithm doesn’t care about any of the specifics, nor does it attempt to make any logical conclusions about what kind of person Joe may be. All the algorithm cares about is whether this DATA allows it to put Joe in a statistical bucket along with other people like him, and whether people in that bucket are likely to exhibit any similar behaviors in the future.
By crunching all these numbers and making constant comparisons between what we’ve done and what we do next, BIG DATA algorithms can predict our behavior with startling accuracy. Social media sites use the data they’ve collected about us to determine, with about 80 percent accuracy, who is about to get divorced, who is coming down with the flu, who is pregnant, and who may consider a change in sexual orientation – before we know ourselves. Once algorithms have determined that Mary is, say, 80 percent likely to go on a diet in the next three weeks, they will fill her feeds with messages and news content about dieting: “FEELING FAT?” Some of these messages are targeted marketing, paid for by the site’s various advertisers. But the purpose of the messaging isn’t just to sell any particular advertiser’s products.
THE DEEPER OBJECTIVE IS TO GET USERS TO BEHAVE MORE CONSISTENTLY WITH THEIR PROFILES AND THE CONSUMER SEGMENT TO WHICH THEY’VE BEEN ASSIGNED.”
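Rushkoff supplies no code, but the loop he describes (coarse metadata buckets, a prediction read off from one’s bucket-mates, and a feed seeded so the prediction reinforces itself) can be pictured with a small, entirely hypothetical Python sketch. Every name, number, “behavior,” and content item below is invented for illustration; this is not any company’s actual system.

```python
# Hypothetical sketch of the profiling loop described above: quantize a few
# bits of metadata into a bucket key, "predict" a user's next behavior from
# what the rest of the bucket already did, and, if the prediction is confident
# enough, fill the feed with content that nudges the user toward it.
from collections import Counter

# Made-up metadata: (commute_miles, minutes_between_phone_checks,
#                    buys_fat_free_cookies, days_late_watching_show)
users = {
    "joe":  (12, 16, True, 2),
    "ann":  (11, 15, True, 2),
    "raj":  (13, 18, True, 2),
    "mia":  (45,  5, False, 0),
    "lena": (44,  6, False, 0),
}

# Observed "next behavior" for everyone except Joe (also invented).
next_behavior = {"ann": "starts a diet", "raj": "starts a diet",
                 "mia": "books a trip", "lena": "books a trip"}

# Content the platform could push for each predicted behavior.
content_library = {
    "starts a diet": ["FEELING FAT?", "5 diets that actually work"],
    "books a trip":  ["Cheap flights this weekend"],
}

def bucket_key(meta):
    """Coarsely quantize metadata so 'people like Joe' share a bucket."""
    miles, gap, cookies, delay = meta
    return (miles // 10, gap // 10, cookies, delay)

# Group users by bucket; the algorithm never asks what any feature "means".
buckets = {}
for name, meta in users.items():
    buckets.setdefault(bucket_key(meta), []).append(name)

# Predict Joe's next behavior from the majority vote of his bucket-mates.
mates = [n for n in buckets[bucket_key(users["joe"])] if n != "joe"]
votes = Counter(next_behavior[n] for n in mates)
prediction, count = votes.most_common(1)[0]
confidence = count / len(mates)

# If confident enough, seed the feed so the prediction reinforces itself.
feed = content_library[prediction] if confidence >= 0.7 else []
print("Joe's bucket-mates:", mates)
print(f"Predicted next behavior: {prediction} ({confidence:.0%} of bucket)")
print("Feed seeded with:", feed)
```

In a real system the bucketing would be statistical clustering over millions of users rather than a hand-written key, but the logic is the same: the profile, not the person, decides what appears in the feed.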
“Algorithms use our past behavior to lump us into statistical groups and then limit the range of choices we make moving forward. If 80 percent of people in a particular BIG-DATA segment are already planning to go on a diet or get divorced, that’s fine. But what of the other 20 percent? What were they going to do instead? What sorts of anomalous behavior, new ideas, or novel solutions were they going to come up with before they were persuaded to fall in line? In many human enterprises, there’s a tendency toward the Pareto principle, or what’s become known as the 80/20 rule: 80 percent of people will behave rather passively, like consumers, but 20 percent of people will behave more actively or creatively. For example, 80 percent of people watching videos online do only that; 20 percent of them make comments or post their own. While 80 percent of kids play games as they were intended to be played, 20 percent of kids modify them or create their own levels. The people in the 20 percent open up new possibilities. We are using algorithms to eliminate that 20 percent: the anomalous behaviors that keep people unpredictable, weird, and diverse. And the less variation among us – the less variety of strategies and tactics – the less resilient and sustainable we are as a species. Survivability aside, we’re also less interesting, less colorful, and less human. Our irregular edges are being filed off. We develop computer algorithms that constantly advance IN ORDER TO MAKE HUMAN BEINGS MORE PREDICTABLE AND MACHINELIKE.”
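The 80/20 observation above is, at bottom, a simple tally. A toy version, with an invented activity log (none of this is real data), might look like this:

```python
# Toy tally of the 80/20 split described above. "view" stands for passive
# consumption; anything else (comment, upload) counts toward the active
# 20 percent whose anomalies the text says the algorithms tend to smooth away.
activity_log = {
    "u1": ["view", "view"],          "u2": ["view"],
    "u3": ["view", "view", "view"],  "u4": ["view"],
    "u5": ["view", "comment"],       "u6": ["view"],
    "u7": ["view"],                  "u8": ["view"],
    "u9": ["view", "upload"],        "u10": ["view"],
}

active = [u for u, acts in activity_log.items()
          if any(a != "view" for a in acts)]
share_active = len(active) / len(activity_log)

print("Active (creating) users:", active)
print(f"Share of users who do more than watch: {share_active:.0%}")  # 20%
```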