Cybernetics, Your Key to Living Forever


Though it may seem straight out of a sci-fi space saga like Star Trek, the word “cybernetics” actually has its origins in ancient Greece. It is derived from kybernetes, meaning “steersman” or “governor.” Cybernetics is just that: the governing, or control, of communication between man and machine, by man or machine. It may have its roots in ancient times, but cybernetics is the key driving force behind the future of technology.

Simple Cybernetics

Cell Phones Are Changing Social Interaction.

Ira Hyman. Psychology Today, 11.27.16

Let’s start small, like handheld small. Almost everyone on the planet has a cell phone. We use it every day to text, call, take pictures, or go on social media. It is our unessential essential: in reality we can live without it, but at the same time we can’t imagine our lives without it. Cell phones are the simplest form of cybernetics. We communicate with the phone, the phone communicates with the cell network, and the network communicates with another phone. This happens hundreds of thousands of times every single day! The age of the user and the medium of communication do vary, though.

First the obvious finding. Age relates to big differences in how many text messages people send and receive each day. Young adults rely on text messages but older adults send and receive substantially fewer texts. In the over 50 group, more than 80 percent send and receive fewer than 10 texts each day. But young adults are texting much more every day. Interestingly, we found no difference in the number of cell phone calls made and received. Nobody is making very many — over 90 percent in every age group made fewer than 10 calls each day. The age difference in cell phone use is in texting. Young adults also use text messaging as their primary method of contacting friends — over 80 percent report texting as their preferred method. The percentage of people who use texting as their primary method of contacting friends drops in older age groups. Older adults (over age 50) prefer calling or email. Given the age difference in the number of texts, it shouldn’t be surprising that younger adults believe it is more appropriate to use their cell phones in a greater variety of situations than do older adults. Across the board, younger adults saw text messaging as more acceptable than older adults. (Hyman)

The evidence is clear: the younger generation is the driving force, and they are doing it with texting (don’t text and drive, though). It seems that cybernetics and the new generation are helping each other grow and become more exposed to the world. The block quote shows how the older generation lags behind in adopting new modes of communication. As the young generation grows into the old and a new generation arises, a new level of cybernetics will arise to go along with it.

Smart Cars

The Dangers of ‘Self-Driving’ Car Hype.

Scott Keogh. Wall Street Journal, 11.27.16

Not the small kind, the self-driving kind. While the idea is now a real concept in testing, it is still far from making its way onto our roads, with everyone getting their own personal taxis. The concept is that a driver doesn’t actually have to drive: a computer governs the car, making turns and stops to reach the passengers’ destination. The reason we don’t see cars like these appearing all around us is, of course, the many dangers they present that must first be overcome, such as human error. A self-driving car can be as prepared as they come for accident prevention, with high-tech cameras and sensors, but all of that would mean nothing if someone driving manually crashed into the smart car. Not so smart now, it seems. Still, there is hope that we may one day see self-driving cars on our roads.

Audi offers this technology in the form of “traffic jam assist.” This function aids drivers during some of the most frustrating and stressful parts of their commute. It keeps them in their lane and allows them to take their hands off the wheel in slow-moving traffic for 15 seconds at a time. In 2018 Audi will deliver what we expect to be the first vehicle on the market equipped at Level 3, conditional automation. The A8 model will allow the driver to operate the vehicle hands-free under 35 miles an hour, when specific conditions are met. When the most advanced levels of automated vehicles are ready — starting with Level 4, high automation — drivers will be able to delegate the responsibilities of monitoring the environment to the vehicle. While that future won’t arrive until at least the next decade, there’s reason to look forward to it. (Keogh)

Though self-driving cars are the topic of the future, car accessories are the topic of today. Almost every car produced in today’s market is equipped with a touchscreen GPS or a parking-assist sensor that beeps when something is close to the bumper. These forms of cybernetic communication help drivers commute more safely. However, even with all these gadgets, drivers still get into accidents, and almost all of those accidents are caused by human error. Whether it was answering a call, texting a friend, or daydreaming about a delicious ham and turkey sandwich for lunch, all of these fall under distracted driving. That’s the beauty of self-driving cars: there is no distracting a computer. They eliminate the biggest cause of automobile accidents, human error. I believe that once self-driving cars are on the streets, there will be fewer automobile accidents. Guaranteed.
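A parking-assist beeper is itself a tiny cybernetic loop: a sensor measures the gap behind the bumper, and the car talks back to the driver, beeping faster as the gap shrinks. As a rough illustration only (the distance thresholds and the read_distance_cm sensor function here are hypothetical, not taken from any manufacturer's system), the logic might look something like this:

```python
import time

# Hypothetical thresholds: the closer the obstacle, the shorter the pause
# between beeps. None means the obstacle is too far away to warn about.
BEEP_INTERVALS_CM = [(150, None), (100, 1.0), (50, 0.5), (20, 0.2), (0, 0.05)]

def beep_interval(distance_cm):
    """Return seconds between beeps for a given distance, or None for 'quiet'."""
    for threshold, interval in BEEP_INTERVALS_CM:
        if distance_cm >= threshold:
            return interval
    return 0.05  # practically touching the bumper: near-continuous tone

def parking_assist(read_distance_cm, cycles=50):
    """Poll a (hypothetical) rear-bumper sensor and beep at the driver."""
    for _ in range(cycles):
        interval = beep_interval(read_distance_cm())
        if interval is None:
            time.sleep(0.2)  # nothing nearby; just keep polling
        else:
            print("BEEP")
            time.sleep(interval)
```

The same sense-and-respond pattern, scaled up with cameras, radar, and far more sophisticated software, is what a self-driving car would run continuously.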

Artificial “Intelligence”

How We Became Posthuman

Katherine Hayles. University of Chicago Press, 11.27.16

Alan Turing’s famous “imitation game” goes like this:

You are alone in the room, except for two computer terminals flickering in the dim light. You use the terminals to communicate with two entities in another room, whom you cannot see. Relying solely on their responses to your questions, you must decide which is the man, which the woman. Or, in another version . . . you use the responses to decide which is the human, which the machine. One of the entities wants to help you guess correctly. His/her/its best strategy, Turing suggested, may be to answer your questions truthfully. The other entity wants to mislead you. He/she/it will try to reproduce through the words that appear on your terminal the characteristics of the other entity. Your job is to pose questions that can distinguish verbal performance from embodied reality. If you cannot tell the intelligent machine from the intelligent human, your failure proves, Turing argued, that machines can think. (Hayles)

This famous concept created the basis for what is known as the Turing test, a test that determines whether an artificial intelligence is actually able to “think.” The ultimate goal of anyone working on the artificial intelligence of the future is to pass this test and develop a program or machine that can really think for itself. In order to pass, the A.I. must be able to fool the human into believing it thinks just like a human, responding to questions without vague or misleading answers. Currently, we carry around a powerful A.I. in our pockets. Apple’s Siri on the iPhone is an excellent example of the most current form of artificial intelligence. Siri can answer simple questions, do unit conversions, look up the weather, tell you exactly what planes are flying overhead, and even flip a coin for those tough decisions. However, even with that impressive and extensive list, Siri is still not able to pass the Turing test. This is because Siri and so many similar A.I. programs, like Amazon’s Alexa and Microsoft’s Cortana, don’t actually “think” for themselves; they just pull information from a larger server to answer the questions asked. They would never be able to hold a real conversation with a human, and it is for that exact reason that they wouldn’t be able to fool one and would fail the Turing test. Even with the impressive technology implemented in their programming and the marvelously massive amount of code each contains, our society is still far from a perfect A.I. Perhaps someday in the future we can all have a JARVIS like Tony Stark does in the Iron Man movies.
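To make the structure of the test concrete, here is a toy sketch in Python of a single imitation-game round. Everything in it is illustrative (the canned_machine lookup table, the questions, the scoring), and it is nowhere near a real chatbot, but it captures why Siri-style lookup fails the test: any question outside the table immediately gives the machine away.

```python
import random

def canned_machine(question):
    """A stand-in 'machine' that only looks up canned answers, much like an
    assistant pulling from a server. Purely illustrative."""
    lookup = {
        "what is 2 + 2?": "4.",
        "what is the weather like?": "It is 72 degrees and sunny.",
    }
    return lookup.get(question.strip().lower(), "I'm not sure about that.")

def imitation_game(questions, human_answers):
    """One toy round: the judge reads answers from hidden players A and B
    and must guess which one is the machine."""
    roles = {"A": "machine", "B": "human"}
    if random.random() < 0.5:          # hide which terminal is which
        roles = {"A": "human", "B": "machine"}

    for question, human_answer in zip(questions, human_answers):
        for label in ("A", "B"):
            answer = canned_machine(question) if roles[label] == "machine" else human_answer
            print(f"Q: {question}\n   {label}: {answer}")

    guess = input("Which player is the machine, A or B? ").strip().upper()
    if roles.get(guess) == "machine":
        print("Correct: the machine was caught.")
    else:
        print("Fooled: the machine passed this round.")

# Example run with two questions and the 'human' answers to them.
imitation_game(
    ["What is 2 + 2?", "How did your weekend go?"],
    ["Four, why do you ask?", "Lazy. I mostly read and walked the dog."],
)
```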

[Image: Iron Man talking to his A.I., JARVIS]


From Touch Displays to the Surface: A Brief History of Touchscreen Technology

Florence Ion. Ars Technica, 11.27.16

Perhaps the most intimate way we interact and communicate with technology is through touch. Our phones have locks keyed to our individual, unique fingerprints. We tap and poke at our devices all day, and they respond as they are programmed to. It is the closest thing we have to a perfect form of cybernetics, since all the communication takes nothing but a simple tap. With that single tap, so much can be done, from depositing money to denying people you don’t know on Facebook. For those in positions of power, a single tap on a screen could topple a government and start a war, possibly a world war. A little touch could have groundbreaking and earth-shattering results. Touchscreens were not always this capable or this common, though; early multitouch devices struggled for years to catch on.

Multitouch technology struggled in the mainstream, appearing in specialty devices but never quite catching a big break. One almost came in 2002, when Canada-based DSI Datotech developed the HandGear + GRT device (the acronym “GRT” referred to the device’s Gesture Recognition Technology). The device’s multipoint touchpad worked a bit like the aforementioned iGesture pad in that it could recognize various gestures and allow users to use it as an input device to control their computers. “We wanted to make quite sure that HandGear would be easy to use,” VP of Marketing Tim Heaney said in a press release. “So the technology was designed to recognize hand and finger movements which are completely natural, or intuitive, to the user, whether they’re left- or right-handed. After a short learning-period, they’re literally able to concentrate on the work at hand, rather than on what the fingers are doing.”(Ion)

Much of the touchscreen technology we use in our everyday lives stems from these failed attempts at making a once uncommon technology common for everyone. It seems the timing just was not right. The timing would be right, however, when tech-world heavy hitter Microsoft came out with the Surface.

Before there was a 10-inch tablet, the name “Surface” referred to Microsoft’s high-end tabletop graphical touchscreen, originally built inside of an actual IKEA table with a hole cut into the top. Although it was demoed to the public in 2007, the idea originated back in 2001. Researchers at Redmond envisioned an interactive work surface that colleagues could use to manipulate objects back and forth. For many years, the work was hidden behind a non-disclosure agreement. It took 85 prototypes before Surface 1.0 was ready to go. As Ars wrote in 2007, the Microsoft Surface was essentially a computer embedded into a medium-sized table, with a large, flat display on top. The screen’s image was rear-projected onto the display surface from within the table, and the system sensed where the user touched the screen through cameras mounted inside the table looking upward toward the user. As fingers and hands interacted with what’s on screen, the Surface’s software tracked the touch points and triggered the correct actions. The Surface could recognize several touch points at a time, as well as objects with small “domino” stickers tacked on to them. (Ion)

Though simple in design, the prototype of the now popular tablet wasn’t so much a tablet as it was a giant table. But it got people thinking about the implications of touchscreens in modern life, and soon touchscreens were being incorporated into everyday life, starting with the phone.
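Gesture recognition of the kind DSI and Microsoft were chasing comes down to tracking multiple touch points over time and interpreting how they move relative to one another. Here is a minimal Python sketch of one such rule; the coordinates, pixel threshold, and gesture names are made up for illustration and are not taken from any real touch API.

```python
import math

def distance(p1, p2):
    """Euclidean distance between two (x, y) touch points, in pixels."""
    return math.hypot(p1[0] - p2[0], p1[1] - p2[1])

def classify_two_finger_gesture(start_points, end_points, threshold=40):
    """Classify a two-finger gesture from its start and end touch points.

    start_points and end_points are each a pair of (x, y) tuples, as a
    hypothetical touch controller might report them; the threshold is arbitrary.
    """
    d_start = distance(*start_points)
    d_end = distance(*end_points)
    if d_end - d_start > threshold:
        return "pinch out (zoom in)"
    if d_start - d_end > threshold:
        return "pinch in (zoom out)"
    return "two-finger drag"

# Two fingers that started 100 px apart and ended 260 px apart: a zoom-in.
print(classify_two_finger_gesture([(100, 300), (200, 300)],
                                  [(40, 300), (300, 300)]))
```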

Cybernetic Spinal Implants

Flexible Spinal Implants Help Paralyzed Rats Walk Again

Esther Hsieh. Scientific American, 11.27.16

We often take the little things in life for granted, such as walking, moving our arms, or even talking. For some, these simple and basic forms of movement and communication are difficult if not impossible. Car crash victims often end up paralyzed, making life much more difficult for them and for those who must take care of them. The thought of being a burden to their loved ones can drive victims into a spiral of depression. Thankfully, researchers are developing cybernetic spinal implants that could one day help paralyzed victims use their limbs again. One such implant is being tested on rats and could lead to major breakthroughs for paraplegics everywhere.

The “e-dura” implant is made from a silicone rubber that has the same elasticity as dura mater, the protective skin that surrounds the spinal cord and brain . . . In an experiment that lasted two months, the scientists found that healthy rats with an e-dura spinal implant could walk across a ladder as well as a control group with no implant. Yet rats with a traditional plastic implant (which is flexible but not stretchable) started stumbling and missing rungs a few weeks after surgery. The researchers removed the implants and found that rats with a traditional implant had flattened, damaged spinal cords — but the e-dura implants had left spinal cords intact. (Hsieh)

Hopefully we will soon see people once bound to wheelchairs walking again, so that they may play with their kids in the yard, participate in triathlons, or even just go on hikes. The implications don’t stop at the legs, either. Spinal implants could also help those with paralysis in the arms or even full-body paralysis. According to the website for Paralyzed Veterans of America, there are “approximately 100,000 veterans with a spinal cord injury.” Helping our veterans walk again would be just a small repayment for what they did for our country.

Brain Implants

Wirelessly Powered Brain Implant Could Treat Depression

Charles Q. Choi. LiveScience, 11.27.16

Moving a little higher up the central nervous system, the brain is subject to many injuries and diseases, from Alzheimer’s to depression and many others. Many people struggle with psychiatric disorders that make life difficult for them. Some have seizures so severe that they are unable to take care of themselves, and their surroundings can become dangerous, if not deadly. Luckily, there is a way that severe seizures can be treated, and it’s already in common use today. It’s called vagus nerve stimulation, and it’s been used as a treatment for drug-resistant seizures for years now. The concept is simple: an implanted device delivers electrical impulses along the vagus nerve to the parts of the brain involved with the seizures, stimulating them. The treatment works much like a pacemaker for the heart, which delivers electrical impulses to stimulate overactive tissue and regulate its function. The brain is also responsible for our mood; it lets us be happy, sad, and angry. Depression can be caused by a lack of certain neurotransmitters, such as dopamine or serotonin, and the disorder can leave those who have it in a state of constant sadness and misery. When pills and meditation fail, there is still hope.

A wirelessly powered implant the size of a grain of rice can electrically stimulate the brains of mice as the rodents do what they please. The new gadget could help scientists better understand and treat mental health disorders such as depression. . . neuroscientists would like to electrically stimulate the brains of simpler animals as they scurry around, carry out tasks and respond to their surroundings. The scientists could power the implant as the mice roamed across a 6.3-inch-wide (16 centimeters) chamber lined with a magnetic lattice. The device was implanted in a region of the mouse brain known as the infralimbic cortex, which is implicated in animal models of depression and anxiety. (Choi)
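Whether the pulses travel along the vagus nerve or come from a tiny implant like the one above, the pacemaker analogy is mostly about timing: short electrical pulses delivered on a fixed on/off schedule rather than continuously. As a purely illustrative sketch (none of these numbers are clinical settings), the bookkeeping for such a schedule might look like this:

```python
from dataclasses import dataclass

@dataclass
class StimulationSettings:
    """Illustrative stimulator settings; every number is a placeholder,
    not a clinical parameter."""
    frequency_hz: float = 20.0    # pulses per second while 'on'
    on_seconds: float = 30.0      # length of each stimulation burst
    off_seconds: float = 300.0    # rest between bursts

def duty_cycle(s: StimulationSettings) -> float:
    """Fraction of time the stimulator is actively delivering pulses."""
    return s.on_seconds / (s.on_seconds + s.off_seconds)

def pulses_per_burst(s: StimulationSettings) -> int:
    """Number of pulses delivered during one 'on' period."""
    return int(s.frequency_hz * s.on_seconds)

s = StimulationSettings()
print(f"duty cycle: {duty_cycle(s):.1%}, pulses per burst: {pulses_per_burst(s)}")
```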

By electrically stimulating parts of the brain, we could influence when people feel happy, sad, mad, full, or hungry. We could even turn people apathetic if we wanted, shutting off their emotional capabilities altogether. Another part of the brain we could manipulate is the prefrontal cortex, the part of the brain that makes us who we are personality-wise. We could change someone’s personality so that they are more outgoing, less nervous when speaking publicly, or more sympathetic toward shelter dogs and cats. There are ethical dilemmas to all of this, though. Many would stand against manipulation of the brain unless it were for a life-saving reason. In all, using chips in the brain to help us think more clearly and be happier seems like the future.


Cyborgs Part 1

This Cybernetic Hand Is the First to Give Amputees a Sense of Touch.

George Dvorsky. io9, 11.27.16

No, I’m not talking about the “I’ll be back” kind of cyborg from the movies; I’m talking about the cybernetic prosthetics we can develop to replace real limbs. Amputees may have lost their limbs to an accident or disease, but thanks to cybernetics, they can have artificial limbs that allow them to hold things, write again, and even feel. One experiment has already helped a man feel again after he lost his arm in a fireworks accident nine years prior.

LifeHand2 allows amputees to feel sensory-rich information in real time. The futuristic device, which transmits signals to nerves in the upper arm, is not yet commercially available or portable — but it represents an important proof-of-concept as researchers work to create more life-like assistive devices. . . Dennis Aabo Sørensen will go down in history as the first cyborg to receive an artificial limb that allows for the sense of touch. When using the new hand, and while wearing a blindfold, the 36-year-old Dane could detect how strongly he was grasping, as well as the shape and consistency of the various objects he picked up. (Dvorsky)

While we are far from where we want to be with this technology, the potential is promising. Once we have mastered it, prosthetic arms could lift more than an average man can and make difficult, heavy tasks easy for those who wear them. But again, we are far from having full sensory capabilities in prosthetics. For now, we are thankful for the progress that has been made. Having the ability to write one’s name on paper again or hold the hand of a loved one is a precious thing for many amputees. Though we often take these aspects of daily life for granted, cybernetic prosthetics allow everyone to share in the joys of the little things in life.
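At its core, a touch-enabled prosthetic closes a loop: pressure sensors in the artificial hand produce readings, and those readings are translated into electrical stimulation of the residual nerves so the wearer feels something proportional to the grip. A minimal sketch of that mapping might look like the following; the linear scaling and all of the numbers are hypothetical, not the encoding LifeHand2 actually uses.

```python
def pressure_to_stimulation(pressure_newtons,
                            max_pressure=10.0,
                            min_current_ua=20.0,
                            max_current_ua=80.0):
    """Map a fingertip pressure reading to a nerve-stimulation current (microamps).

    Purely illustrative: a real device's encoding, safety limits, and units
    would be very different.
    """
    # Clamp the sensor reading to the range the mapping expects.
    p = max(0.0, min(pressure_newtons, max_pressure))
    if p == 0.0:
        return 0.0  # no contact, no stimulation
    # Scale linearly between the smallest perceptible and largest allowed current.
    return min_current_ua + (p / max_pressure) * (max_current_ua - min_current_ua)

# A light touch and a firm grasp produce noticeably different currents.
print(pressure_to_stimulation(1.0))   # 26.0
print(pressure_to_stimulation(8.0))   # 68.0
```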

Cyborgs Part 2

How Far Can the U.S. Military Go to Building a Technology-Enhanced ‘Super Soldier’?

Steven Metz. World Politics Review, 11.27.16

Okay, this time I really am talking about the cyborgs we see on the big screen: the ones with grenade launchers and flamethrowers in their arms and high-tech targeting systems in their eyes. Humans outfitted with cybernetic upgrades for combat seem like the soldiers of the future. As cool as this sounds, however, enhancing soldiers for combat raises many ethical challenges.

While still in the realm of science fiction, someday genetic engineering could combine with technological and pharmaceutical human enhancements to allow the military to create what it might see as the perfect soldier. But however useful the practice might be on the battlefield, it would also raise very troubling ethical questions. Would a military veteran who has been irreversibly enhanced in some way be able to assimilate back into civilian society, or would the veteran have what was seen as unfair advantages over unenhanced humans, thus creating resentment? Would veterans be considered less human than non-enhanced people? If so, would they be ostracized the way Vietnam veterans were in the 1970s? As Col. Dave Shunk asked in Military Review, “Will genetic engineering, neurobiological augmentation, and specialization prevent demobilizing soldiers at the end of a conflict,” thus relegating them to a life apart from the society they served?(Metz)

If we were to one day start enhancing our soldiers with cybernetic upgrades, we would need to make sure we do so ethically. We would not want to draft soldiers into the army just to mutilate them with enhancements and then expect them to return and go back to being normal citizens. When combining technology and biology, there is a dangerous line that, if crossed, could lead to restricting the humanity of those we experiment on. We always hope war never leads to drastic measures, but sometimes it does.

Living Forever

The Singularity Is Near: Mind Uploading by 2045?

Tanya Lewis. LiveScience, 11.27.16

There is a lot that cybernetics can do for the people of the world. But perhaps the most incredible thing we could accomplish with cybernetics is achieving immortality. The idea is that we could somehow upload our minds to servers and download them into body after body, living on and on in new bodies but with the same mind. It’s called whole brain emulation (WBE), or mind uploading, and while it is only a hypothetical application of cybernetics right now, it could very well become real in the future.

According to Moore’s law, computing power doubles approximately every two years. Several technologies are undergoing similar exponential advances, from genetic sequencing to 3D printing, Kurzweil told conference attendees. He illustrated the point with a series of graphs showing the inexorable upward climb of various technologies. By 2045, “based on conservative estimates of the amount of computation you need to functionally simulate a human brain, we’ll be able to expand the scope of our intelligence a billion-fold,” Kurzweil said. Itskov and other so-called “transhumanists” interpret this impending singularity as digital immortality. Specifically, they believe that in a few decades, humans will be able to upload their minds to a computer, transcending the need for a biological body. The idea sounds like sci-fi, and it is — at least for now. The reality, however, is that neural engineering is making significant strides toward modeling the brain and developing technologies to restore or replace some of its biological functions. (Lewis)
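The arithmetic behind Kurzweil’s curves is just compound doubling: at one doubling every two years, thirty years multiplies computing power by 2^15, roughly 32,000-fold, and a billion-fold gain corresponds to about thirty doublings. A quick check in Python (the two-year period is the textbook Moore’s-law figure; the rest is plain arithmetic, not Kurzweil’s own model):

```python
import math

def growth_factor(years, doubling_period_years=2.0):
    """Compound growth under a fixed doubling period (Moore's-law style)."""
    return 2 ** (years / doubling_period_years)

# Doubling every two years for 30 years (for example, 2015 to 2045):
print(f"{growth_factor(30):,.0f}x")        # 32,768x

# How many doublings would a billion-fold increase require?
print(f"{math.log2(1e9):.1f} doublings")   # about 29.9
```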

While many of us would like to cheat death and live forever, some would challenge the ethics of putting a brain into a computer. That is understandable, as the idea is still very taboo today. One day, though, we could see the end of humanity’s greatest plague, death, and go on to build epic civilizations of immortal beings. Overpopulation wouldn’t be a problem anymore, as we could simply keep minds uploaded to computers until we can start colonies on other planets or find a more sustainable food source. Our greatest minds could live on forever, pondering life’s important questions and driving the technology of our society forward. Families wouldn’t need to worry about the pain of losing loved ones to tragic accidents or terminal illnesses. We could even upload our pets’ minds to computers so they could live forever too.


Cybernetics is the ever-evolving, ever-growing communication between man and machine. It can be used for good in many cases and is a powerful resource for giving second chances at life. For many, it offers the chance to hold their loved ones and run with their friends again after tragic accidents left them paralyzed. We use it every day, from our phones to our cars to our TVs. We dream of a near future with cars that drive themselves, making our roads much safer. While the ethics are questionable, cybernetics could be the key to unlocking our immortality and finally beating death. Cybernetics is the communication of the present and the communication of the future.

Works Cited

Choi, Charles Q. “Wirelessly Powered Brain Implant Could Treat Depression.” LiveScience. TechMedia Network, n.d. Web. 27 Nov. 2016.

Dvorsky, George. “This Cybernetic Hand Is the First to Give Amputees a Sense of Touch.” Io9. N.p., 2014. Web. 27 Nov. 2016.

Hayles, Katherine. “Prologue.” How We Became Posthuman. University of Chicago Press, n.d. Web. 27 Nov. 2016.

Hsieh, Esther. “Flexible Spinal Implants Help Paralyzed Rats Walk Again.” Scientific American. N.p., 2016. Web. 27 Nov. 2016.

Hyman, Ira. “Cell Phones Are Changing Social Interaction.” Psychology Today. N.p., n.d. Web. 27 Nov. 2016.

Ion, Florence. “From Touch Displays to the Surface: A Brief History of Touchscreen Technology.” Ars Technica. N.p., 2013. Web. 27 Nov. 2016.

Keogh, Scott. “The Dangers of ‘Self-Driving’ Car Hype.” The Wall Street Journal. N.p., 2016. Web. 27 Nov. 2016.

Lewis, Tanya. “The Singularity Is Near: Mind Uploading by 2045?” LiveScience. TechMedia Network, n.d. Web. 27 Nov. 2016.

Metz, Steven. “How Far Can the U.S. Military Go to Building a Technology-Enhanced ‘Super Soldier’?” World Politics Review. N.p., 2016. Web. 27 Nov. 2016.