TUBR: Taking on “small data” problems — pt.II

Ksenia Kurileva
Aerospace Xelerated
10 min read · Oct 15, 2021

The second part of our interview with Dash Tabor, Cofounder and CEO of TUBR, about her product experience, managing bias in AI, and much more

Dash Tabor, Cofounder and CEO of TUBR, joins Aerospace Xelerated for Ada Lovelace Week to share her founder journey, how TUBR is taking on “small data” problems and their time-series machine learning algorithm. If you missed part I of our conversation, head over here to check out the article.

You mentioned your experience and mindset as a former product manager. There’s this idea that product managers should be thinking like CEOs, setting the vision and the strategy, and being able to lead a team. How has your PM experience shaped you as a founder? Has it been an advantage in your day-to-day?

I think that’s one of the things that you’re trained in, like you’re the ‘CEO of the product’. I’ve realised there’s a lot more to being a CEO than just product, and I’ve learned so many other things. I do think one thing that’s helped me is my ability to prioritise. I prioritise everything all the time, and I think that’s one of the things that I have found quite easy because of product management. I can look at something and say: that’s great, we need to do it, but let’s push it down the road, because if we spend our time doing that right now, it’s not going to move the needle.

TUBR: The team has designed a London-based app helping people find the optimum time to travel in order to have the best journey experience.

I think it’s easy to get sucked into admin work and things that won’t move the needle, and it’s something that I have to check myself with all the time. I think I’m also more willing to put something out there: like let’s get a small group, let’s put it out there and let’s see what happens. This is my fail fast and pivot approach. I don’t think a failure is a failure, I think it’s just an opportunity to learn. So you know, how quickly can we figure out that something’s not working? What do we take from this experience? I think the ability to listen to the market is equally important.

We pivoted recently: we went from focusing primarily on the data that the app provided us to selling the machine learning. That’s because we kept talking to people about what we were doing, and people kept bringing us their use cases, saying “Would it help me solve this?” So we’ve transferred from the app to a machine learning company, where the app is just helping us validate and continue to build the machine learning, because that’s the bigger opportunity. We’re now talking to people in financial services and healthcare, for example, and the opportunity to solve a small data problem is massive. I think my product management training has really helped me see those opportunities and then determine: is it worth putting a small business case together to validate where the value is, is it worth spending some time exploring this, yes or no?

It’s important to talk about the data gaps […] Working with companies to make sure that women’s input isn’t considered an outlier, and thrown out, will help create that collective view.

It’s interesting what you said about putting out the product and testing because, with AI products, you’ll often see startups that have the mindset of “we’re not ready yet” and waiting until it’s perfect, by their metrics.

That’s one thing we realised: we built this machine learning algorithm very much for our own use case. Then I started having all these conversations in other industries and we said, “Well, we don’t 100% know.” The team thinks it could work, but that’s one of the reasons why we have to test so much now, and we actually did some work for a financial services use case recently. Afterwards our CTO came back and said he could clarify our machine learning so much more now. We can definitely fix that problem, but it’s slightly different for various reasons, and we need to decide if we want to move into that space right now or not. At that point, we went back to them and said not right now, but let’s put it on the roadmap. We know what we need to do, but this isn’t the space that we want to be in right now.

If we hadn’t gone through that process in the way that we did, where we both learn from each other and see how we get on, then I don’t think we would know now what that future roadmap looks like, or that it’s an area we can go into. That’s really what we’re looking at right now: how many of these problems can we tackle, even if it means doing a bit of free work upfront to determine that yes, we can do this, and then finding that product-market fit so we can replicate it. Our roadmap is that we’ll be able to get to a no-code solution in about four years.

Photo by Slidebean on Unsplash

We’re excited to have you with us for Ada Lovelace Week! 👩‍💻 I wanted to touch on this World Economic Forum 2020 Global Gender Gap Report which says that just 26% of professionals in Data and AI are women. I think that’s concerning because these are the very people that shape the products we’re using. In your opinion, what are the current barriers for women in AI? What can we — society, government, organisations, and individuals — do to improve this?

One of the things that I’ve been doing some research on recently is how women are treated differently, particularly from a fundraising perspective. We get asked different types of questions, and I think that’s probably the same when we look at working with women in the AI field. We’re probably asked more about what can go wrong with our AI solutions, as opposed to what can go right. I think being aware of the type of questions that we’re asking women when talking about their solutions will probably get a better response, in terms of feeling confident that women can do this, which we absolutely can.

It’s important to talk about the data gaps too. Our team at TUBR is so adamant about small data problems, instead of looking at this big AI. A lot of data on women is a small data problem: there just isn’t enough data collected on women in order to have a full view. I think working with companies to make sure that women’s input isn’t considered an outlier, and thrown out, will help create that collective view.

One of the things I’ve run across in my career is being dismissed because I’m a woman. One of my proudest moments was when I was sitting in the middle of a meeting with a bunch of men. I was new to the company, and when I would say something, the people in the meeting would respond back to my boss. It was almost like I wasn’t there, like they were hearing what I was saying but they weren’t acknowledging that it came from me. At one point, my boss turned around and looked at me and basically said “We’re going to do whatever Dash says we need to do, so you need to work with her.” I remember thinking: yes, this is great. I think there just needs to be an awareness that this bias does happen. It probably isn’t intentional. It’s probably subconscious, because these are just the gender norms that we’re living in. Putting out more of a dialogue and a conversation around it is really important.

Photo by Shahadat Rahman on Unsplash

Also, making data not scary is really important and I think women are the ones that can have that conversation and change that dialogue. That’s one of the things that we’re working on at TUBR, we’re starting to break into how people feel about data, what they’re scared of, what they think about it so that we can start pushing out some press around it. This isn’t scary, this is okay. If we’re going to be seen as nurturing and mothering then we might as well use it to our advantage.

Going back to the data. Now that you have your artificial intelligence (AI) solution, how will you manage bias in AI and take care of the hard ethical questions as TUBR scales? How do you plan to manage this over the next 5–10 years?

The ethical questions are something we’re actually quite cognisant of. From the beginning, one of our morals as a company was that we were going to be ethical. It’s one of the reasons why we’ve decided not to take on any personally identifiable information (PII). Even when we were talking about the possibility of selling the data, we said that if it were to become personal at some point, then we would pay our users for the use of it. That is something on which we fundamentally said we’re going to be different, we’re going to make a change there.

When we talk about dealing with biases in AI, this is something my technical co-founder has an entire strategy on and something that we’ve really sat down and thought about. Working with small data problems, you could potentially get the wrong predictions because you’re working with a smaller dataset. It’s something we need to be very careful and comfortable with. I think the strategy we’re creating right now will be a basis for that future, and being aware of regulations is part of the conversation too. That’s something else. I was the GDPR Sponsor at Experian, so I’m quite comfortable with a lot of the rules that come along with that. At TUBR, we have a data policy — we might be early, but we have a data policy and a retention policy. We have all the things in place that we need in order to make sure that, as a company, we’re regulating ourselves.

Then there’s the piece around education. This is really important, because I think a lot of companies are scared to talk about what they’re doing with the data and how they’re using it, and it’s so crucial to get the right language when you talk about this topic. This is why we’re doing a lot of market research around how people feel about data, so that we can make sure that when we talk about it, we talk about it in a way that translates the same way our policies do.

If I were to make one general statement, it would be: you’re not special […] if you learn that now, you’ll do so much better, because you’ll think about it in a way that is — how do I prove I’m special? As opposed to just ‘I am special’.

What’s in store for TUBR for 2022?

We’re going to start pushing out more of our analytics solution. We’re working on two different products from a prediction perspective. One is a pathfinder, where we’re working on being able to predict paths through open spaces — think hospitals and retail. The other is just traditional movement prediction, so being able to predict in a smaller space, like we’re doing on public transport at the station level at this point.

Look out for that, and if anybody has any small time-series, small-data problems, we want to talk to you. We’re quite keen to understand what these problems are and whether we can do something about them.

Photo by Leon on Unsplash

Thanks Dash! A quick-fire round of questions for any current and future founders reading this.

What is the best startup advice that you’ve ever received?

To be honest, I don’t think that I’ve ever received just one piece of advice but collectively, if I were to make one general statement, it would be: you’re not special. Everybody has ideas so don’t think that this is going to be the surefire thing and everyone else will drop everything they’re doing because you walked in the room. I think every startup, at the beginning, thinks that this is it, that it’s their idea and it’s awesome. I have a friend that is starting a startup right now and I’m always saying that you have to remember that you’re not special, and if you learn that now, you’ll do so much better, because you’ll think about it in a way that is — how do I prove I’m special? As opposed to just ‘I am special’.

Has there been anyone significant on your startup journey that has inspired you?

I have great advisors who have been wonderful. They’ve been able to help by pointing out things in me that need to change or things that I need to deal with in the team. But my friend, the one building the FinTech mobile app, is truly inspiring because he does not give up, he just keeps going and going and going. He’s got this contagious optimism. I didn’t appreciate what he was going through as a founder at the time when I was helping him with the startup. Now, I fully appreciate just what a hard position he was in and I can better understand some of his decisions as CEO and founder. I think that kind of keeps me in check, because I just think back on how he did the absolute best that he could and how far he was able to push us, which is amazing.

The other thing is that I love hearing stories of startups that didn’t necessarily raise or didn’t take the traditional route, and were still successful. Whenever I hear one of those stories it always makes me go like: yes, this is possible, just keep trucking.

It was a pleasure to talk to Dash and hear her journey into the tech space. We’re excited to see what’s next for TUBR!

We hope you can join us for Ada Lovelace Week — use #AdaLovelace and #WomenInTech on socials to shout your support and celebrate the women you respect. You can follow the content we’re publishing this week via the #AdaLovelace tag here on Medium.

Aerospace Xelerated is a 3-month funded programme for exceptional autonomy and AI startups to accelerate the growth of the aerospace industry. Learn more about our work in our FAQ or book an Office Hours call to chat with the programme team.

Stay up-to-date with our latest updates via Twitter and LinkedIn and subscribe to our mailing list!

