How CraneAi uses Artificial Intelligence to help teams build apps faster

A behind-the-scenes look at how CraneAi uses artificial intelligence to empower teams

Ryan Hickman
CraneAi
5 min read · Nov 30, 2018


Computer vision analyzing UI

With so many advancements in machine learning, we thought through how we could design a process that leverages AI to develop applications. We had to teach our AI to think and behave like a designer and a developer so that it could act as one, helping teams create their apps.

“Programming isn’t about typing, it’s about thinking”
~ Chris Wanstrath — CEO at GitHub

Behind our AI tech

Over the last few decades, other industries have proven that machines can do certain things faster, and often better, than humans. We identified the parts of app development that machines could already do better today and began training machines to do them. It's critical for us not to replace designers or developers; instead we focus on ways to augment them and improve their productivity. If a machine can tackle the redundant tasks shared between apps, humans are free to make more meaningful contributions to their projects.

When we first started, we built our own GPU machines by hand to detect objects in images. Detecting fruit, people and furniture helped us understand how a machine's eyes differ from human eyes. We quickly graduated to detecting features and UI elements in wireframes. After seeing the machine perform, it was clear we needed to scale our techniques, and to do so we needed more GPU horsepower. We also layered strategies together to overcome design nuances, which was a considerable challenge.

Today our teaching (or training) runs on powerful GPU-based Nvidia DGX Stations, which has improved our accuracy dramatically. We take a strict approach to reducing bias in our training, showing our machines a diverse range of apps, styles, workflows and code produced by team members with varying levels of experience, familiarity and background in app development.

Unboxing Nvidia DGX Station

What our engineers teach our machines:

How to see 👀

We use computer vision, the same technology self-driving cars use to see people and read road signs, to see UI elements and read components in wireframes. We developed custom models and pipelines that comprehend design and user experience and can infer complex composite elements. Learn how we train an AI model to understand design.
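To make that concrete, here is a minimal sketch of the kind of detector this could be built on. The UI classes, the untrained weights and the confidence threshold are assumptions for illustration; CraneAi's production pipeline is custom and considerably more involved.

import torch
import torchvision
from torchvision.transforms import functional as F
from PIL import Image

# Hypothetical UI element classes a wireframe detector might be trained on.
UI_CLASSES = ["background", "button", "text_field", "image", "list_item", "nav_bar"]

# A standard Faster R-CNN with a head sized for our classes; real weights would
# come from training on labeled wireframes, not from this untrained model.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(
    weights=None, num_classes=len(UI_CLASSES)
)
model.eval()

def detect_ui_elements(path, threshold=0.7):
    """Return (label, score, box) tuples for UI elements found in a wireframe."""
    image = F.to_tensor(Image.open(path).convert("RGB"))
    with torch.no_grad():
        prediction = model([image])[0]
    return [
        (UI_CLASSES[int(label)], float(score), box.tolist())
        for label, score, box in zip(
            prediction["labels"], prediction["scores"], prediction["boxes"]
        )
        if score >= threshold
    ]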

How to read 📚

Starting with over 16 million lines of code, Swagger specs and API documentation, we trained a comprehension engine and knowledge graph that can understand code patterns and intent.
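As a rough illustration of the same idea, one could turn an OpenAPI (Swagger) spec into a small graph of endpoints, operations and the schemas they return. This is not our comprehension engine, just one way to make patterns and intent queryable; the field names follow OpenAPI 3, and everything else is assumed.

import json
import networkx as nx

def build_api_graph(spec_path):
    """Build a tiny knowledge graph from an OpenAPI 3 spec (illustrative only)."""
    with open(spec_path) as f:
        spec = json.load(f)

    graph = nx.DiGraph()
    for path, operations in spec.get("paths", {}).items():
        graph.add_node(path, kind="endpoint")
        for method, op in operations.items():
            if method not in {"get", "post", "put", "delete", "patch"}:
                continue
            op_id = op.get("operationId", f"{method.upper()} {path}")
            graph.add_node(op_id, kind="operation", method=method)
            graph.add_edge(path, op_id, relation="exposes")
            # Link each operation to the schema it returns, so intent such as
            # "this endpoint produces a Post" can be queried later.
            for response in op.get("responses", {}).values():
                ref = (
                    response.get("content", {})
                    .get("application/json", {})
                    .get("schema", {})
                    .get("$ref", "")
                )
                if ref:
                    schema = ref.split("/")[-1]
                    graph.add_node(schema, kind="schema")
                    graph.add_edge(op_id, schema, relation="returns")
    return graph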

How to think 🧠

Using deep learning, we teach our machines to understand what they are reading and seeing in the projects they study, and how planning, execution, bugs, risks and speed affect progress.

How to write 📝

…Code, tasks and documentation. Using the images, screens and information you provide, our machines can create and manage tasks, estimate timelines, write production-quality code, document how it works and debug the human-written code added alongside it.
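As a toy example of the "writing" step, here is how detected elements could be turned into a scaffolded screen stub. The template, the naming scheme and the element labels are hypothetical and only hint at the shape of generated output, not the code CraneAi actually produces.

# Hypothetical element-to-code snippets; real generation is model-driven.
ELEMENT_STUBS = {
    "button": "        self.add_button()        # TODO: wire up tap handler",
    "text_field": "        self.add_text_field()    # TODO: add validation",
    "list_item": "        self.add_feed()          # TODO: bind to data source",
}

STUB_TEMPLATE = '''class {screen_name}Screen:
    """Auto-generated stub for the {screen_name} screen."""

    def build(self):
{body}
'''

def write_screen_stub(screen_name, detected_elements):
    """Emit a scaffolded screen from the elements found in a wireframe."""
    body = "\n".join(
        ELEMENT_STUBS[e] for e in detected_elements if e in ELEMENT_STUBS
    )
    return STUB_TEMPLATE.format(screen_name=screen_name, body=body or "        pass")

print(write_screen_stub("Feed", ["nav_bar", "list_item", "button"]))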

Key ways CraneAi uses AI (features)

With powerful intelligence packed under the hood, product teams can use CraneAi to build higher-quality apps faster, with features that assist them as if they had another designer, developer or project manager on their team.

Saving time recommending navigation flow with AI

When new screen concepts are introduced to a user's app project, AI considers how the user will navigate to and from the view. Drawing on its familiarity with apps, CraneAi recommends links between views and creates the connections between them.

CraneAi identifies gaps in the navigation flow as well. Whenever links are missing from the flow, CraneAi highlights them to help teams uncover roadblocks that break software.
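One simple way to picture this: treat screens and links as a directed graph and flag anything that cannot be reached from the entry screen. The screen names below are made up, and CraneAi's real flow analysis goes well beyond plain reachability.

import networkx as nx

def find_navigation_gaps(links, entry="Login"):
    """links: iterable of (from_screen, to_screen) pairs."""
    flow = nx.DiGraph(links)
    reachable = nx.descendants(flow, entry) | {entry}
    return set(flow.nodes) - reachable

links = [
    ("Login", "Home"),
    ("Home", "Feed"),
    ("Feed", "PostDetail"),
    ("Settings", "Profile"),  # never linked from any reachable screen
]
print(find_navigation_gaps(links))  # {'Settings', 'Profile'}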

Improving project planning with AI

Whenever users submit concepts to their project, everything is examined. Using the knowledge amassed from training, the machine looks at the design styles, the placement of elements and their relationship to one another, the general purpose of the view, how the view fits into the overall project, the navigation requirements associated with the view, security, networking operations, events and testing requirements. From these findings, specific tasks are created and then managed by CraneAi. CraneAi sets a due date, owner and time-budget, and follows through with human team members until the project is completed.

CraneAi also adds tasks based on machine-learned task dependencies. For instance, if a feed of posts is detected, CraneAi will create a data-bound feed and feed template with associated tasks, including network operations and state-based conditions (isLoading, willLoad, didLoad, etc.), rather than simply recreating exactly what it sees in the view.
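A rough sketch of this kind of expansion is below. The dependency rules, hour estimates and scheduling are handwritten stand-ins for what the model learns; the fields simply mirror the due date, owner and time-budget mentioned above.

from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class Task:
    title: str
    hours: float                 # time-budget
    owner: Optional[str] = None  # assigned to a human team member later
    due: Optional[date] = None

def expand_feed_component(start=None):
    """Fan a detected 'feed of posts' out into dependent tasks (illustrative rules)."""
    start = start or date.today()
    tasks = [
        Task("Create data-bound feed template", hours=4),
        Task("Implement network operation to fetch posts", hours=3),
        Task("Handle state conditions (isLoading, willLoad, didLoad)", hours=2),
        Task("Write tests for empty, loading and error states", hours=3),
    ]
    # Naive scheduling: stack time-budgets into due dates, ~6 focused hours/day.
    elapsed = 0.0
    for task in tasks:
        elapsed += task.hours
        task.due = start + timedelta(days=elapsed / 6)
    return tasks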

With machine-assisted project and task planning, teams can more accurately estimate what it takes to deliver apps, align on requirements and have a clear view of progress. As new ideas are introduced, new work is committed and requirements grow in detail, the project plan adjusts in real time, making it difficult for scope to creep without stakeholders being well-informed before making decisions.

Tasks generated by AI

Communication with GitHub

Each time new code is committed to the project or there is a change to the repo, CraneAi ingests the details of those changes and maps how they impact the project. In some instances tasks are automatically adjusted with updates, and often automatically closed out. In other cases bugs are detected, code-style issues are identified and new tasks are added to reflect the associated risks of these changes.
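In spirit, the ingestion side looks something like the sketch below: a handler for GitHub's push webhook that maps changed files onto affected tasks. The payload fields follow GitHub's push event; the task lookup and the task methods are hypothetical placeholders, not CraneAi's actual API.

from flask import Flask, request

app = Flask(__name__)

def tasks_touching(path):
    """Placeholder: look up open tasks whose scope includes this file."""
    return []

@app.route("/webhooks/github", methods=["POST"])
def on_push():
    payload = request.get_json()
    for commit in payload.get("commits", []):
        changed = commit.get("added", []) + commit.get("modified", [])
        for path in changed:
            for task in tasks_touching(path):
                task.mark_updated(commit["id"])      # hypothetical task API
                if task.is_complete_after(changed):  # hypothetical completeness check
                    task.close()
    return "", 204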

Commit alert message in synced Slack integration

We’re always looking for new team members who share the vision of helping reimagine how apps are developed. If you have experience with AI and love to learn, shoot us an email at team@crane.ai. 🙏🏽


Ryan Hickman
CraneAi

Passionately focused on building and investing in Artificial Intelligence and the Blockchain