It was a little after 9:00 pm on a rainy Wednesday night in May 2020.
It had been nearly four months since Steve McDaniel and Corey Gaspard met and started spending all their free time writing code together. Ever since the 2017 release of the now-infamous U.S. Navy video of the hazy UAP aptly named the “Tic Tac,” the two had shared one very eccentric goal: to be the first software engineers to use artificial intelligence to catch a UFO on video while simultaneously capturing its physical signatures with a bevy of specialized sensors, improving the odds of both authenticating the sighting and gathering measurable data.
Progress had been slow. Neither of them was familiar with the sensitive sensors they were coding for and testing, and that night seemed to be unfolding just like all the rest.
They had spent countless hours working out the kinks in the new software, while also testing different cameras. They made it a habit to record the videos of any aircraft that flew overhead for later use as training material for the AI, so it would be able to recognize routine aircraft while it was scanning the skies for UAP.
Once again that night, Steve sat watching the monitor as it displayed his latest bench-test camera’s panoramic view of the sky. As usual, three airliners traced long, slow arcs over the city, appearing on the screen as faint, white, blinking glints, captured intermittently by the new camera’s ultra-low-light sensor as they moved overhead. The tedium of recording the plodding airliners was broken only by the occasional white flash on the display as lightning from the nearby electrical storm lit up the camera sensor on the roof of the building.
And then, just as the last of the three supposed airliners approached the middle of the screen, it just stopped.
And it remained motionless, frozen in a single pixel onscreen, for what felt like an eternity before it shot off directly toward the electrical storm that was now less than a mile to the east.
Everyone knows airplanes aren’t supposed to do that.
Jolted alert, Steve immediately began checking to make sure he was actually recording this event. He pinged the other Sky Hub team members, a handful of mostly STEM professionals who lived in other parts of the world, to get their opinion on the strange data. Everyone watched and re-watched the footage. They all agreed that whatever it was, it was displaying non-ballistic motion, but without more data from the planned sensor arrays — still in heavy development — no one could be sure what they had captured that night.
But that event shifted the momentum for the group, driving them to complete Sky Hub’s Tracker and Sensor Arrays, and to get the word out about the project. In that moment, the small team realized that there could be unidentified anomalous phenomena flying overhead constantly. It reaffirmed their hunch that a global network of AI-controlled sensor arrays could begin catching the signatures of unidentified flying objects as soon as the system was turned on, much like the surprising number of bacteria Antonie van Leeuwenhoek saw when he peeked into his first microscope.
Sky Hub is not the first sky-watching UFO project. In fact, over the decades there have been countless projects like it, going all the way back to the “bird-watching” groups of physicists at Los Alamos, New Mexico in the 1940s, for whom UFOs were so ever-present that they sat outside to wait for them to fly overhead. Even the U.S. government has been running its own UFO programs for years. What has changed, and what makes Sky Hub different, is the availability of machine-learning hardware and software, along with highly sensitive low-light sensors, that only a few years ago would have cost more than the GDP of a small country. Today that technology is ubiquitous and ready for the tenacious software developers of the Sky Hub team to aim it up toward the waiting sky and say, “Keep watch for us.”