A Glimpse of Tomorrow

A Fiction of What Will Be:

Projected UI

My personal digital agent (PDA) tells me that the iSight link is being established for the meeting I have been waiting on with our partners in Beijing. I stop for a moment to let the augmented display reset from my browsing and workflow session, and set my transparency really high; I am moving through the city and want to take it in. I know the collision avoidance would keep me fairly safe in a more immersive meeting, but I just find this translucent, layered approach more to my liking when I am active. As I wrap up the conference call my head begins to throb. Too much caffeine. I make an offhand remark during the call about the pain, and as soon as I disconnect, the first thing my PDA brings into my visual workflow is a map overlay with directions to a nearby pharmacy that has an e-pon for 10% off aspirin if I come in within the next 30 minutes and GIS-validate it.

The Semantic Web has become a reality, and the growing collective intelligence is using my attention data in ways that mostly help but sometimes hinder workflow. Never forget to inform your PDA that you are going off profile while researching Prada bags for your wife; I spent a month trying to clear the “helpful” sales announcements for all things high-end and feminine. I could, of course, opt out of the “helpful” items and do the legwork myself, but I will be damned if I ever have to manually locate the nearest Starbucks again. And since my Essence v6.2 PDA knows my drinking patterns, it’s not like it tells me, every block, that yet another Starbucks is coming up on my right.

Back at the office I slip into my Tactile Feedback (TFB) cover and walk over to my work surface. My hands slide through the air and I get into workflow management. PDAs are great, but they don’t know when you want to take a few minutes and drop into the Metaverse to hit a hot new club like the Blacksun and chill with your Avboys. After a few dance-like moves I have organized my inner-space; what did we ever do without visual recognition? I seem to recall, back before my first gene-therapy rejuvenation treatment, back in my first-round younger days, having to drag and drop with a mouse in a flat 2D interface. Ugh, how inefficient and frankly prehistoric; but then, we also once tapped out Morse code to send messages.

As I look over at the wall, the surface becomes a video interface and lets me know that my three o’clock is just about to knock on the door, and my PDA asks what response to send our guest. I am just about to reply when my carbon-limit indicator starts to go off. Damn. I am horrible at remembering to use my bio-cell backups, and my footprint is dangerously close to the taxable range. Ah well, I guess my account will be debited the fee. Before I can turn to greet my guest, my financial indicator lets me know that the transaction has in fact occurred on my account. That quick buy-and-sell order on some Bitcoin that netted me a small profit yesterday has now been paid to the government for my extravagant use of my carbon footprint allowance. I knew I should not have driven my classic Jeep into the city earlier this week to see a show at the Voxelodian; it is always better to let my Tesla Zx drive me itself.

I swipe away all the items in my workspace and walk to the door, giving my home AI the thumbs up to open it. I greet my old friend Tom and we do the obligatory memory share. My vision fills with the 3D highlights of his son Timmy’s basketball game last night, as well as a tease of the amazing dinner he and his wife had at that place we spoke about last week. My PDA knows how much of my personal space to share and how our social graphs interface. He receives a package teaser of my highlights from the last week as well, and we move into the den to share a pint. Some things never change.


Originally published at www.ihaverobots.com on April 13, 2016.