Ideas Lab
Ideas Lab’s Product Evolution in 2022

At the end of Q3'21, a fascinating paper came out exploring the various dimensions of athlete preparation for the 2022 Beijing Olympics, including nutrition, training, technology and data. A month later, an article out of the Beijing Institute of Technology explored the various ways 3D motion capture technology helped drive an “Intelligence Training Management System,” specifically for alpine skiing, cross-country skiing, ski jumping and speed skating. With every twist and turn on the mountain hurtling by at 30 miles an hour (in the case of speed skating), split-second decisions by both body and mind can spell the difference between a win and a loss, or, more critically, between injury and safety.

Bridging motion with intelligence through physical sensors is not new; nor, we would argue, is using motion capture technology to understand an individual or a group in a specific motion. Technology runs fast, and anything that has been around for more than a year starts to look ancient. So, like the Red Queen, our team at Ideas Lab has been running faster.

Admittedly, this update comes somewhat delayed, but we remain deeply entrenched in our Q1'22 initiatives. Below is an update for our customers, investors, advisors and partners around the world on our initiatives across our technology, our product portfolio and, most importantly, our growing Ideas Lab family.

Specifically, these past several months we’ve been working on three major initiatives:

  • Academic partnerships — notably with several universities looking into specific deliverables and technologies (below we highlight one as an example)
  • Xview product — the ability to upload videos and have, within minutes, up to 25 body points detected, analyzed and visualized. The natural evolution is to then provide prescriptive analytics around improving strength, decreasing the chance of injury, and overall offering AI-driven but actionable recommendations to improve. By working with domain experts, both in specific sports and in biomechanics, we’re building our system to translate motion into meaning.
  • Project Fury — focused specifically on golf, we plan on introducing golf’s first real-time full body motion and golf club tracking application without the use of sensors. The app will provide golf biomechanics feedback while plotting the body and golf club trajectory in a 3D world with input solely from a 2D video! Additional upcoming applications include “deepfake” motion replication in the world of visualization and functional equivalence training. The latter — and other elements of the platform we’re building — are in a stealth phase, with plans to disclose more of what we’re launching by Q3'22.

First, about that growing family…

We recently brought on an HR manager, Cynthia Cheng, to help support and structure all employment-related matters. Welcome, Cynthia!

And while we’re here: interested in joining the team? Check out some of our open positions here:

To support our HR efforts we’ve also engaged recruiting agencies, notably TalentMade, Glint and Paul Wright Group.

And now, to the technology…

Project Fury golf analytics:

Using both traditional computer vision and proprietary data and AI models, we can now track 33 body joint positions, up from 17 previously. We’ve extended our 2D model into 3D by training golf-specific CNN models on golf swings alone, so that with only video as input we can recreate a golfer and their motion in 3D. The output model will run on edge/iOS devices with target inference rates of 20–30 fps. Our objective is to have the most accurate 2D→3D model within domain-specific sports, not only by improving pose estimation methods but also by augmenting existing 3D datasets. Our ability to augment data is a key advantage over other companies.
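The 2D→3D step can be illustrated with simple pinhole-camera geometry. This is a hedged sketch, not our production model: the function names, camera intrinsics and per-joint depths below are hypothetical placeholders standing in for what a learned lifting network would predict.

```python
def backproject(u, v, depth, fx, fy, cx, cy):
    """Recover a 3D camera-space point from a 2D keypoint plus a depth estimate.

    (u, v): pixel coordinates of the detected joint.
    depth: estimated distance from the camera in metres; in a learned
           pipeline this would come from the lifting model, not a sensor.
    (fx, fy, cx, cy): pinhole camera intrinsics (focal lengths, principal point).
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Lift a hypothetical 33-joint 2D skeleton, assuming the model predicted a
# depth for each joint (placeholder values only).
pose2d = [(640.0 + i, 360.0 + i) for i in range(33)]
depths = [3.0] * 33
pose3d = [backproject(u, v, z, fx=1000.0, fy=1000.0, cx=640.0, cy=360.0)
          for (u, v), z in zip(pose2d, depths)]
```

The same back-projection runs per frame, which is why hitting 20–30 fps on device is largely a question of how fast the 2D detector and depth predictor are.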

In addition to the 3D model, we’ve incorporated a real-time golf club tracker that follows the path of the club without sensors or a watch, and with no manual annotations needed! Analytics will include club speed, club path, club acceleration, hand speed, shaft lean, tempo and arm lean.

Hip Swing Analytics

In various sports, power comes from the base; in golf, the hips in particular. With that in mind, we are building a system that analyzes hip movement by taking the full golf swing, identifying four key moments of the swing, and visualizing the movement of the hips. We can build in a comparative line (say, your hip movement vs. Tiger’s, or vs. an older, less experienced you) that shows where the delta between your swing and a competitive swing lies. Check out the example analytics below, which track the hips’ vertical and horizontal movement across the swing.
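A minimal sketch of that comparison idea, assuming hip-joint coordinates in metres and four key swing moments. The moment names, joint positions and function names are illustrative only, not our production pipeline.

```python
MOMENTS = ("address", "top", "impact", "finish")  # hypothetical key moments

def hip_center(left_hip, right_hip):
    """Midpoint of the two hip joints."""
    return tuple((a + b) / 2 for a, b in zip(left_hip, right_hip))

def hip_deltas(centers):
    """Per-moment (horizontal, vertical) displacement relative to address."""
    x0, y0 = centers[0]
    return {m: (x - x0, y - y0) for m, (x, y) in zip(MOMENTS, centers)}

def compare(user, reference):
    """Signed gap between the user's hip movement and a reference swing."""
    return {m: (user[m][0] - reference[m][0], user[m][1] - reference[m][1])
            for m in MOMENTS}

# Illustrative (left_hip, right_hip) positions at the four moments:
user = hip_deltas([hip_center((-0.10, 0.90), (0.10, 0.90)),   # address
                   hip_center((-0.06, 0.92), (0.14, 0.92)),   # top
                   hip_center((-0.12, 0.88), (0.08, 0.88)),   # impact
                   hip_center((-0.04, 0.91), (0.16, 0.91))])  # finish
ref = hip_deltas([(0.00, 0.90), (0.02, 0.91), (-0.05, 0.87), (0.05, 0.90)])
gaps = compare(user, ref)  # where the user's sway/lift differs from the reference
```

Plotting the two delta curves on one chart gives exactly the comparative line described above: the gap at each moment shows where the swings diverge most.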

XView Platform

Last year we built the foundation of our platform: a basic web interface with an uploader, a video player, and data visualization layers. In Q4, we focused mainly on containerizing all our existing AI model services and designing a unified data schema and API for all of them.

We have also wrapped up work on callable network services used to serve API requests both internally and externally. Initial analytics on the XView platform will include boxing and golf. Furthermore, a unified data schema is in place as a common language to support data requirements regardless of sport domain.
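As a rough illustration of what a sport-agnostic schema can look like, here is a hypothetical sketch: every service emits frames of named keypoints plus free-form, domain-specific metrics, so a golf service and a boxing service share one shape. The field names below are our own placeholders, not the actual XView schema.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Keypoint:
    name: str          # e.g. "left_hip"
    x: float
    y: float
    confidence: float  # detector confidence in [0, 1]

@dataclass
class FrameRecord:
    sport: str                    # "golf", "boxing", ...
    frame_index: int
    keypoints: list = field(default_factory=list)
    metrics: dict = field(default_factory=dict)  # domain-specific values

# One frame of a golf analysis, serialized to the common wire format:
record = FrameRecord(
    sport="golf",
    frame_index=42,
    keypoints=[Keypoint("left_hip", 0.41, 0.63, 0.98)],
    metrics={"club_speed_mps": 38.5},
)
payload = json.dumps(asdict(record))
```

Because only the `metrics` dictionary varies by sport, the same upload, storage and visualization code paths can serve every domain.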

Our platform is now powered by most of our core technologies. With the unified schema and interface, our central servers can interact with apps, and even with clients or third-party developers who have their own customized interfaces and are looking for the “brain” of their applications.

We also redesigned our file-processing standards. As more core AI models are consolidated and more data is uploaded into our system, we need a data pipeline that handles all our clients’ data safely and correctly. A robust pipeline also helps us build a more distributed system: this quarter we’re moving some components, including file storage, to dedicated machines and services such as AWS for better stability and scalability.
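One small but essential piece of such a pipeline is verifying that files survive each hop intact. A minimal sketch, assuming chunked reads and SHA-256 digests; the function names are illustrative, not our actual pipeline code.

```python
import hashlib

def checksum(chunks):
    """SHA-256 over a stream of byte chunks, so large video uploads can be
    verified without loading a whole file into memory."""
    h = hashlib.sha256()
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest()

def verify_transfer(local_chunks, remote_chunks):
    """Compare digests computed on each side of a transfer."""
    return checksum(local_chunks) == checksum(remote_chunks)

# Illustrative chunked payloads: an intact copy and a corrupted one.
data = [b"frame-0", b"frame-1"]
ok = verify_transfer(data, [b"frame-0", b"frame-1"])
bad = verify_transfer(data, [b"frame-0", b"frame-X"])
```

Running the same check before and after moving storage to dedicated machines or a cloud service catches corruption at whichever hop it occurs.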

We’ve also recently acquired servers hosting Nvidia A6000 GPU cards and Titan RTXs with up to 48GB of VRAM. This will let us integrate and train models more quickly and effectively. Data center selection is currently underway.


Our partnership with two researchers at FCUAI and NTUS on pitch spin axis and rate identification has been steadily progressing. The core objective of this project is to identify the spin rate and axis of pitches from streaming baseball games. The current system, seen across millions of television, web and mobile screens, is Statcast (Hawk-Eye), which leverages a series of 12 cameras placed strategically around the stadium. Hawk-Eye’s use of pose tracking, a major innovation over the Doppler/video technology used in prior years, helped propel MLB forward.

Our system involves training a convolutional neural network (CNN) model to identify the spin axis and rate of a pitch. We aim specifically to identify the differences between generated and real data, and to train our models to identify the spin axis and rate of real pitches. Concretely, we simulate ball spin using a 3D baseball model, using the orange line in the image as the axis and spinning at random rates between 1,500 and 3,000 rotations per minute (rpm); the simulation doesn’t consider gravity, air resistance or the Magnus force.
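The spin simulation itself reduces to rotating points on the ball about a fixed axis. A minimal sketch using Rodrigues' rotation formula, with a hypothetical seam marker, unit axis and frame rate; like the simulation described above, it ignores gravity, drag and the Magnus force.

```python
import math
import random

def rotate(point, axis, angle):
    """Rotate a 3D point about a unit-length axis by `angle` radians
    (Rodrigues' rotation formula)."""
    x, y, z = point
    u, v, w = axis
    c, s = math.cos(angle), math.sin(angle)
    dot = u * x + v * y + w * z
    return (
        u * dot * (1 - c) + x * c + (v * z - w * y) * s,
        v * dot * (1 - c) + y * c + (w * x - u * z) * s,
        w * dot * (1 - c) + z * c + (u * y - v * x) * s,
    )

def spin_frames(point, axis, rpm, fps, n_frames):
    """Positions of a seam marker across n_frames of video at `fps`,
    for a ball spinning at `rpm` about `axis`."""
    omega = rpm * 2 * math.pi / 60.0  # rpm -> radians per second
    return [rotate(point, axis, omega * (i / fps)) for i in range(n_frames)]

random.seed(0)
rpm = random.uniform(1500, 3000)  # random spin rate, as in the simulation
frames = spin_frames((1.0, 0.0, 0.0), (0.0, 0.0, 1.0), rpm, fps=240, n_frames=8)
```

Rendering such frames for many random axes and rates yields labeled training pairs (image sequence, spin parameters) without any manual annotation.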

Since the ball images collected in actual games are relatively small (48x48 pixels), the ball images fed into the model are similarly small, requiring further image processing to make the balls in our system look more like those seen in real life.
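One simple way to make high-resolution synthetic renders resemble those 48x48 broadcast crops is to downsample them, for instance by block averaging. The sketch below is illustrative only; the actual processing chain may include blur, noise and compression artifacts as well.

```python
def block_average(img, factor):
    """Downsample a grayscale image (list of rows of floats) by averaging
    factor x factor blocks -- a crude stand-in for the resolution loss that
    real in-game ball crops undergo."""
    h, w = len(img), len(img[0])
    out = []
    for by in range(0, h, factor):
        row = []
        for bx in range(0, w, factor):
            block = [img[y][x]
                     for y in range(by, by + factor)
                     for x in range(bx, bx + factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# Shrink a synthetic 96x96 render down to 48x48 to better match the
# low-resolution balls cropped from game footage.
hi_res = [[float((x + y) % 256) for x in range(96)] for y in range(96)]
lo_res = block_average(hi_res, factor=2)
```

Matching the degradation statistics of real footage is what lets a model trained on simulated spins transfer to real pitches.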

Exciting days ahead on this partnership. If interested in learning more, don’t hesitate to get in touch!

Additional partnerships include our work with National Cheng Kung University and a further project with NTUT related to a gamified 3D avatar. More on these in later posts!

Where we’re going in 2022

Some exciting days ahead — specifically:

  • Launch of Xview, currently focused on boxing, but which will encompass a broader range of sports and domain-specific metrics
  • By Q2'22, we’ll have a beta test of our golf product on iOS. Click here to join the growing beta community.
  • We’re also exploring partnership opportunities with both retail players in the sports & fitness industry as well as complementary technology vendors.
  • Last, we’re looking to expand our advisory board, consisting of biomechanists, fitness executives, researchers and other thought leaders helping us build a market-leading product in sports AI.


Ideas Lab is an innovation lab and start-up studio building proprietary artificial intelligence, machine vision, and human motion analysis technologies. Today, while developing a suite of AI-based solutions, we are building a network of corporate and academic partners with whom we will improve human performance in the many ways people move, perform and play!

Visit us to learn more about Ideas Lab today!
