In a previous blog post, I talked about the extended enterprise and how Snowflake can provide the data architecture to support it. Today I will try to illustrate it with a more realistic scenario. The scenario comes from my manufacturing experience but can be adapted to any industry: several parties need to share their data to get the full picture of a product and then estimate its production costs. This could be part of a Design-to-Cost business process aiming at optimising product design and fabrication to …
In manufacturing, there are multiple use cases where you need to combine ERP, PLM, and IoT data to produce valuable business insights.
The challenge is that this data lives in silos, sometimes in different divisions, and can even be collected by third parties; the sources are a mixture of structured and semi-structured data. …
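To make the idea concrete, here is a minimal sketch of combining a structured ERP record with a semi-structured IoT event to estimate a unit production cost. The record shapes, field names, and prices are all illustrative assumptions, not real ERP or IoT schemas:

```python
import json

# Structured ERP master data for a part (illustrative shape, not a real ERP schema)
erp_part = {"part_no": "P-100", "material_cost": 12.50, "labour_rate_per_hour": 40.0}

# Semi-structured IoT telemetry from the shop floor, as it might land as JSON
iot_event = json.loads("""
{
  "part_no": "P-100",
  "machine": "CNC-7",
  "cycle_time_minutes": 9.0,
  "extras": {"energy_kwh": 1.2}
}
""")

def estimate_unit_cost(erp, iot, energy_price_per_kwh=0.15):
    """Combine ERP master data with IoT measurements into a per-unit cost estimate."""
    labour = erp["labour_rate_per_hour"] * iot["cycle_time_minutes"] / 60
    energy = iot.get("extras", {}).get("energy_kwh", 0) * energy_price_per_kwh
    return round(erp["material_cost"] + labour + energy, 2)

print(estimate_unit_cost(erp_part, iot_event))  # → 18.68
```

The point is not the arithmetic but the join: the semi-structured side is read defensively (`get` with defaults) because third-party feeds rarely guarantee a schema.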
This week I’m running out of time for a solution deep dive, so what better than a 45-minute data integration challenge? Here is the objective:
The GDPR (General Data Protection Regulation) will take effect in May 2018 and will affect every organisation that collects or handles data relating to EU citizens.
The first step towards compliance for most organisations is being able to audit the data: to understand where it comes from, how it was processed, with whom it is shared, and under what consent.
The ability to manage metadata alongside the data in an operational system is a key enabler of compliance.
This week, we will illustrate how MarkLogic can easily manage data and metadata together in order to create an operational data hub able to enforce compliance rules. …
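As a sketch of the idea, the pattern is an envelope document that keeps GDPR-relevant metadata (source, collection date, recorded consents) next to the data it describes, so access checks can be enforced at read time. The field names and the `read_for_purpose` helper are hypothetical, for illustration only, not a MarkLogic API:

```python
# An "envelope" document keeping GDPR-relevant metadata next to the data itself.
# All field names here are illustrative assumptions, not a MarkLogic schema.
customer_doc = {
    "metadata": {
        "source": "crm-export-2018-04",
        "collected": "2018-04-02",
        "consents": ["service-emails"],   # purposes the data subject agreed to
    },
    "data": {"name": "Jane Doe", "email": "jane@example.com"},
}

def read_for_purpose(doc, purpose):
    """Return the data only if the recorded consent covers the requested purpose."""
    if purpose in doc["metadata"]["consents"]:
        return doc["data"]
    raise PermissionError(f"no consent recorded for purpose '{purpose}'")

print(read_for_purpose(customer_doc, "service-emails")["email"])  # jane@example.com
# read_for_purpose(customer_doc, "marketing")  # would raise PermissionError
```

Because the metadata travels with the document, the audit question ("where does this come from, under what consent?") is answered by the document itself rather than by a separate system.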
As we explained in part #1, in MarkLogic we can store PLM objects (Part, Assembly, etc.) and product structure sections as XML or JSON documents. As an operational database, MarkLogic provides all the transactional capabilities required for read/write access to these objects.
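To give a feel for the document model, here is a sketch of what a Part and one product-structure link might look like as JSON documents. The field names are illustrative assumptions; a real PLM export would define its own schema:

```python
import json

# A Part and a product-structure link as JSON documents (illustrative shapes).
part = {
    "type": "Part",
    "id": "PRT-001",
    "revision": "B",
    "attributes": {"name": "Bracket", "material": "Aluminium 6061"},
}
usage = {
    "type": "Usage",      # one parent/child edge of the product structure
    "parent": "ASM-010",
    "child": "PRT-001",
    "quantity": 4,
}

# Each object would be stored as its own document, so updating a part
# revision is a single-document, transactional write.
print(json.dumps(part, indent=2))
```

Modelling each usage as its own small document keeps structure edits (adding a child, changing a quantity) local to one document instead of rewriting a whole assembly tree.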
Today we will take a deep dive into MarkLogic’s semantics and multi-model capabilities.
We mentioned before that product structure management can vary from one solution to another.
As the objective is to create a unified source of truth for all PLM data coming from internal and external (providers, partners) PLM systems, it is also important to be able to manipulate the business concepts through a shared knowledge model. …
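The shared-knowledge idea can be sketched with a minimal in-memory triple store: objects from two source PLM systems are mapped onto a common vocabulary and linked as the same real-world part, so one query spans both systems. The prefixes and predicate names (`shared:`, `sysA:`, `shared:usedIn`) are made up for illustration, not a standard ontology:

```python
# A minimal triple-store sketch: two source systems mapped to a shared vocabulary.
# Prefixes and predicates are illustrative assumptions, not a real ontology.
triples = [
    ("sysA:PRT-001", "rdf:type", "shared:Part"),
    ("sysB:item-42", "rdf:type", "shared:Part"),
    ("sysB:item-42", "owl:sameAs", "sysA:PRT-001"),  # same real-world part
    ("sysA:PRT-001", "shared:usedIn", "sysA:ASM-010"),
]

def query(subject=None, predicate=None, obj=None):
    """Return triples matching the given pattern (None acts as a wildcard)."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# All parts known under the shared vocabulary, whichever system they came from:
parts = [s for s, _, _ in query(predicate="rdf:type", obj="shared:Part")]
print(parts)  # ['sysA:PRT-001', 'sysB:item-42']
```

In a real deployment this role is played by a proper triple store queried with SPARQL; the sketch only shows why a shared vocabulary lets one pattern match data from every source system.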
I have been working with manufacturers in the automotive and aerospace industries for about two years. During this period I met industry experts in PLM, technical documentation and after-sales, from both clients and partners.
This industry is quite new to me, so the post below is a humble contribution that aims to illustrate how MarkLogic can quickly deliver value in such a context, as it does in other industries, with minimal development.
In this first part, we will present:
My first personal computer was an Amstrad PC1512, a long time ago in a galaxy far, far away… but since then, nothing had really changed. Maybe the floppy disk, the number of colours (there were two: black and white), and other minor details…
I have been testing @Shadow_France (https://shadow.tech/) for a month now.
I’m based in London and the company had only opened subscriptions in France last year, but I gave it a try.
I’m really attached to my MacBook, but I must admit that personal cloud computing is now mature, and it is the future of personal computing. Shadow offers 8 dedicated threads on an Intel Xeon server processor, 12 GB of RAM, 256 GB of storage space (not that much), Windows 10, and an NVIDIA GTX 1080 GPU (actually a Pro card, so almost a best-in-class option at the moment). …
As we start a new year, let’s have a retrospective of some of the main solutions powered by MarkLogic from my 2017.
We are going to give an overview of: