Say Hello to App RetargetDreams

App advertisers can now retarget their users based on what they dream.

It may look like he’s sleeping, but he’s feeding dream data to the phone.

In the last few years, Jampp has launched a number of innovations to help advertisers personalize ads to retarget their users with relevant offers.

With Dynamic Product Ads for example, we can match a user’s search intent or shopping activity with an ad impression to dynamically create ads that bring them back to complete a purchase or suggest complementary products.

Also, thanks to our Predictive Bidding technology, we can leverage shopper behavior and contextual data to create user segments based on the likelihood of purchasing or converting, thus paying more to show ads to those who are closer to completing a purchase.

Sweet Dreams are Made of Feeds

But is that really enough? What about sleep time? Yes, what about when people are sleeping?

Most smartphone users charge their phones next to them while they sleep, and according to a survey by ReportLinker, nearly half of Americans check their smartphones as soon as they wake up, while they’re still in bed.

Most humans sleep an average of 8 hours [1], which accounts for a third of their day, a significant portion of time in which they are not exposed to ads.

When confronted with this data, we realized that the first phone pick-up after sleeping is the best time to show consumers a retargeting ad: their brains and bodies are rested, they haven’t been exposed to any other ads in the previous 6–8 hours, and they are still in a dream-like state that makes them more responsive to advertising.

So we took it to the lab.

It was right there, in front of our noses, we only needed a magnifying glass to really see it.

What we discovered is that when Android or iOS devices are plugged into the wall, we can combine the Bluetooth and WiFi chips from the phones along with the natural radiofrequency electromagnetic radiation (EMR) from the ionosphere and accurately produce a data feed that imports into the phone, in real time, the average person’s dreams during the REM (Rapid-Eye Movement) stage of sleep.

The data is then encrypted and uploaded to Jampp’s servers where we match it to a database of 32.3 million products and services from our advertisers’ apps. Once there’s a match, the platform dynamically creates an ad with the product that most closely resembles what you dreamed of.
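In the spirit of the (entirely fictional) pipeline described above, here is a whimsical sketch of the matching step: dream keywords compared against a toy product catalog, with the closest-resembling product winning the ad slot. Everything here is invented for illustration, from the `Product` class to the Jaccard-similarity metric, and the three-item catalog stands in for the 32.3-million-product database.

```python
from dataclasses import dataclass


@dataclass
class Product:
    name: str
    keywords: set


# Toy catalog standing in for the 32.3-million-product database.
CATALOG = [
    Product("Beach Holiday Package", {"beach", "sun", "ocean", "sand"}),
    Product("Running Shoes", {"running", "race", "marathon", "track"}),
    Product("Pizza Delivery", {"pizza", "cheese", "dinner", "food"}),
]


def match_dream(dream_keywords):
    """Return the product whose keywords best overlap the dream feed
    (Jaccard similarity), mimicking the 'closest resemblance' step."""
    def jaccard(a, b):
        return len(a & b) / len(a | b) if (a | b) else 0.0

    return max(CATALOG, key=lambda p: jaccard(dream_keywords, p.keywords))


# Last night's REM feed, decrypted server-side (allegedly).
ad = match_dream({"ocean", "sand", "flying"})
print(f"Dynamically created ad: {ad.name}")
```

Dreaming of the ocean gets you a beach-holiday ad, which is roughly the level of scientific rigor this project operates at.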

Some of the dreams processed triggered alerts to the FBI and other agencies but we quickly started filtering those out.

The project, which took almost 2 years, was a team effort of epic proportions and was kept as a company secret until today. Its code name was “Eurythmics”.

“It was hard having to explain to our coworkers that taking naps at the office was an actual job requirement and part of a project. Even more so, things could get a little heated on Claudio’s napping turns, as he is known to be a heavy snorer and the siestario was right next to a conference room,” said Santiago Hernandez (Lead Dream Analyst), referring to Claudio Freire (Head of Dream Catching).

App RetargetDreams is only the first in a series of projects being launched by Jampp that aim to have a more direct (intimate) relationship with app users.

“At Jampp we are always looking for creepier ways of retargeting you, and (literally) getting into your dreams was the next natural step. The technological breakthrough to achieve this is sure to be a Nobel prize finalist. Furthermore, we are close to being able to reverse the process and incept product cravings into your dreams while you sleep [2] [3].” said Guido Crego, Jampp’s Head of Product. He also added, “No, there’s no app for what you dreamed last night, you should see a therapist”.

As of today, Jampp’s App RetargetDreams technology is available in alpha to all Jampp advertisers.

Sweet dreams!

References

[1] Global sleeping patterns revealed by app data
http://www.bbc.com/news/health-36226874

[2] T. Horikawa, M. Tamaki, Y. Miyawaki, and Y. Kamitani. “Neural Decoding of Visual Imagery During Sleep.” Science 340.6132 (2013): 639–42. Print.
http://science.sciencemag.org/content/340/6132/639

[3] Shinji Nishimoto, An T. Vu, Thomas Naselaris, Yuval Benjamini, Bin Yu, and Jack L. Gallant. “Reconstructing Visual Experiences from Brain Activity Evoked by Natural Movies.” Current Biology 21.19 (2011): 1641–646.
https://www.sciencedirect.com/science/article/pii/S0960982211009377
