This week, DataSeries, an OpenOcean-led initiative, hosted another Virtual Roundtable together with Jennifer Schenker about “How SMB-facing tech startups are seeing 2020 as an opportunity”.
The following article was published via The Innovator; a summary appears below.
Small and medium-sized enterprises have long been the lifeblood of the European economy, accounting for more than two-thirds of the workforce and more than half of economic value added. The effect of the COVID-19 crisis on SME performance is immense: a recent McKinsey survey in the UK found that 80% of SMEs say their revenues are declining. They also report several related effects: concern about defaulting on loans (25%); concern about their ability to retain employees (24%); doubt in their ability to sustain their supply chains (28%); expectations of reducing headcount in the aftermath of the pandemic (28%); and postponement of growth projects (36%). Due to the pandemic, one fifth of businesses now expect to default on loans and lay off staff. …
This week, DataSeries, an OpenOcean-led initiative, hosted another Virtual Roundtable about “The Power of Graphs” together with Jennifer Schenker, who published an article about this here:
How graphs will transform data management and business
Complexity of connections — Why knowledge graphs?
If your dataset is weighted heavily towards the connections between entities, more so than the entities themselves, then relational databases are not the best solution. Knowledge graphs apply semantics to give data context and relationships, providing a framework for data integration, unification, analytics and sharing. …
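The point above can be made concrete with a minimal, illustrative sketch (not from the roundtable; all names invented): facts stored as (subject, predicate, object) triples, the basic shape of a knowledge graph, where a multi-hop question is a single traversal rather than a chain of relational joins.

```python
# Facts as (subject, predicate, object) triples — the basic knowledge-graph shape.
triples = [
    ("Alice", "works_at", "Acme"),
    ("Acme", "based_in", "Berlin"),
    ("Bob", "works_at", "Acme"),
    ("Berlin", "located_in", "Germany"),
]

def neighbors(entity):
    """All (predicate, object) pairs directly connected to an entity."""
    return [(p, o) for s, p, o in triples if s == entity]

def reachable(entity):
    """Every entity reachable by following relationships outward."""
    seen, stack = set(), [entity]
    while stack:
        for _, obj in neighbors(stack.pop()):
            if obj not in seen:
                seen.add(obj)
                stack.append(obj)
    return seen

# "What is ultimately connected to Alice?" is one traversal here;
# a relational schema would need a join per hop.
print(reachable("Alice"))  # {'Acme', 'Berlin', 'Germany'}
```

Graph databases and triple stores optimize exactly this traversal pattern, which is why connection-heavy data fits them better than row-and-join storage.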
The exponential growth and requirements of AI use cases
The increasing complexity of AI models and the explosive growth of AI model size are both rapidly outpacing current innovations in the computing resources and memory capacity available on a single device. AI model complexity now doubles every 3.5 months, or about 10X per year, driving rapidly increasing demand for AI computing capability. …
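The two figures quoted above are consistent with each other, as a quick calculation shows: a doubling time of 3.5 months compounds to roughly a tenfold increase per year.

```python
# Doubling every 3.5 months means 12 / 3.5 ≈ 3.43 doublings per year.
doubling_time_months = 3.5
growth_per_year = 2 ** (12 / doubling_time_months)
print(f"{growth_per_year:.1f}x per year")  # ~10.8x
```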
This week, DataSeries, an OpenOcean-led initiative, hosted another Virtual Roundtable about “Synthetic Content and Deep Fake technology”.
Humans’ ability to recognise deep-fake pictures is very low, while an AI’s ability to detect them is fairly high. Facebook’s Deepfake Detection Challenge, in collaboration with Microsoft, Amazon Web Services, and the Partnership on AI, was run through Kaggle, a platform for coding contests that is owned by Google.
The best model to emerge from the contest detected deep fakes from Facebook’s collection just over 82% of the time. It was argued that, at the current level of the technology, we won’t get to 90%.
On the other hand, the percentage of deep fakes currently circulating on Facebook is in the single digits and there are many other sources of misinformation. …
This week, DataSeries, an OpenOcean-led initiative, hosted another Virtual Roundtable about “Small Data”.
Small data is also about what you don’t know and what’s in people’s heads
Typically, when people think about small data they think about a lack of quality data and small data sets. A much more fundamental problem is discovering the right data to solve your particular problem. Oftentimes companies try to analyse big data sets to find answers, when the really valuable data that needs to be uncovered is in our heads. Tacit knowledge is something humans cannot easily articulate, so the real question is how to capture this cognitive data. There is a clear push to design systems that capture such data more effectively, and at the same time Gary Klein argues that collecting in-depth expert knowledge is crucial to understanding decision-making. …
In October, DataSeries, an OpenOcean-led initiative, hosted a Virtual Roundtable about “AI-based decision making” together with Jennifer L. Schenker, the founder of The Innovator. This also led to an article that you can find here: AI-Decision Making: State Of Play And What’s Next
The majority of corporations are not ready for sophisticated AI implementations:
Data (schema) standards and structure needed
Finnair, the airline that dominates domestic and international air traffic in Finland, thought it could use AI to manage airport congestion.
AI alone was not up to the job, so Finland’s largest airline instead implemented a hybrid system that uses AI to make predictions about air traffic and allows the humans in the loop to make better decisions, explains Tero Ojanpera, CEO of Silo.ai, a Finnish AI lab that specializes in bringing cutting-edge AI talent to corporations around the world.
Getting the Finnair project to that point was not a question of plug and play. It required a complex, multi-step modeling process to help the organization become more AI literate. …
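The hybrid pattern described above can be sketched in a few lines. This is a hypothetical illustration, not Finnair’s actual system: the model’s prediction is acted on automatically only when its confidence is high; otherwise it is escalated to a human operator. The threshold, labels, and function names are invented.

```python
# Hypothetical human-in-the-loop routing: high-confidence predictions are
# auto-applied, low-confidence ones are surfaced to a human to decide.
CONFIDENCE_THRESHOLD = 0.9

def route(prediction, confidence):
    """Decide whether a model output is auto-applied or escalated."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return ("auto", prediction)
    return ("human_review", prediction)

# A congestion forecast the model is sure about is applied directly;
# an uncertain one is queued for an operator.
print(route("delay_gate_7", 0.95))  # ('auto', 'delay_gate_7')
print(route("delay_gate_7", 0.60))  # ('human_review', 'delay_gate_7')
```

The design point is that the AI narrows the decision space and quantifies its own uncertainty, while people retain authority over the ambiguous cases.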
Anything that can be intelligently automated, will be.
In 1999, as Steven Spielberg was preparing to make the movie “Minority Report”, he assembled a team of 15 technology experts to help him depict the world as it would look in 2054, the year in which the movie takes place. The result was an impressive, somewhat dystopian vision of the future in which technology permeates our lives.
It is too soon to say whether the vision of the future depicted in the movie will become reality, but 18 years after the film’s release, artificial intelligence (AI) and what is often called intelligent enterprise automation have had a profound impact in some areas. …
In July, DataSeries, an OpenOcean-led initiative, hosted a Virtual Roundtable about “Small & Synthetic Data”.
85% of the data collected is “small data” (IDC, Gartner…)
The challenge of small data differs from that of big data: combining data sets is a rarity, so decent data quality is crucial, and the data must be in a standardized format. An obvious limitation is that if a pattern isn’t present in the data, there is no way for the algorithm to learn it. …
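The standardized-format requirement can be illustrated with a toy example (field names and values invented for this sketch): before two small datasets can be combined at all, each record must be mapped onto one common schema.

```python
# Two small datasets describing the same kind of thing with different field
# names and types — a typical obstacle to combining small data.
dataset_a = [{"cust_name": "Acme", "rev_eur": 1200}]
dataset_b = [{"customer": "Globex", "revenue": "950"}]

def standardize(record, name_key, revenue_key):
    """Map a record onto the common schema: name (str), revenue (int)."""
    return {"name": record[name_key], "revenue": int(record[revenue_key])}

combined = (
    [standardize(r, "cust_name", "rev_eur") for r in dataset_a]
    + [standardize(r, "customer", "revenue") for r in dataset_b]
)
print(combined)
# [{'name': 'Acme', 'revenue': 1200}, {'name': 'Globex', 'revenue': 950}]
```

With so few records, a single mis-typed or mis-mapped field materially distorts the result, which is why data quality matters more here than in big-data settings.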
In June, DataSeries, an OpenOcean-led initiative, hosted a Virtual Roundtable about the future of “Conversational AI”.
4 notes/requirements on the journey to maximizing the total value of virtual assistants/chatbots.
1. We are on the cusp of creating a global Virtual Assistant paradigm able to govern entire processes, instead of thousands of scattered assistants specialized in their own fields. Having 10,000 or even 100,000 specialized domain assistants is not the most effective way to reach a truly groundbreaking level of conversational AI capable of navigating complex ecosystems. …