Import and Export Data in Mendix using the CSV Module

Published in Mendix Community · 6 min read · May 29, 2021


Mendix supports multiple options for importing data into an app, and the right one depends on the source type and your requirements.

In this blog, I’ll focus on one of these methods: importing and exporting CSV-formatted data with the help of the Mendix-provided CSV module. CSV is a great option when you have a lot of data to import, for example data exported from an Excel document.

The CSV Module offers some cool advantages here, which are mentioned in the Marketplace:

  1. This module offers a connector-style usage of CSV for imports and exports.
  2. Advantages of this connector style:
  • The model reflects the CSV structure.
  • Microflows can be optimized for importing/exporting (e.g. by caching).
  • Suitable for high volumes (from 100Ks to millions of records).
  • Streaming approach for higher performance and less memory consumption.

Woah! Sounds great already, right? Well, what are we waiting for then? Let’s get started.

Setup

In our Mendix project, we need to download the CSV module (compatible with your version of Studio Pro) from the Marketplace.


Importing CSV Data

Approach

The approach we’ll follow for the import is to upload a CSV-formatted file in our app and use the Java actions from the CSV module to import the data into our application.

CSV-formatted Data as Input

Domain Model

We need to create an entity that inherits from System.FileDocument to store our CSV file, a main entity to store the processed data (here, the Employee entity), and a non-persistable entity.

Domain Model Setup

Here we use a non-persistable entity for two reasons:

  1. The CSV module reads incoming data as String values only.
  2. The data is mapped to attributes in alphabetical order of the attribute names.

Example: if we map data directly to the Employee entity, the CSV module actions would map the input ‘2,Jorg,TechAdmin’ to the attributes sorted alphabetically, i.e. ‘Designation: 2, EmpId: Jorg, Name: TechAdmin’, hence mapping the data wrongly. In addition, ‘EmpId’ is an Integer, whereas CSV supports only String values.

But the choice is really yours: if your entity’s attributes are already in alphabetical order and all of type String, you can skip creating the non-persistable entity.
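To make the alphabetical-ordering pitfall concrete, here is a minimal plain-Java sketch (illustration only, not the module’s actual code; the class and method names are my own). It matches CSV values against attribute names sorted alphabetically, the way the module’s import does, and shows how the values land on the wrong attributes:

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.Map;

public class AlphabeticalMappingDemo {

    // Map one CSV line's values onto attribute names sorted alphabetically
    static Map<String, String> mapLine(String line, String[] attributeNames) {
        String[] values = line.split(",");
        String[] sorted = attributeNames.clone();
        Arrays.sort(sorted); // attributes are matched in alphabetical order
        Map<String, String> mapped = new LinkedHashMap<>();
        for (int i = 0; i < sorted.length; i++) {
            mapped.put(sorted[i], values[i]);
        }
        return mapped;
    }

    public static void main(String[] args) {
        // Intended column order in the file: EmpId, Name, Designation
        Map<String, String> result = mapLine("2,Jorg,TechAdmin",
                new String[] {"EmpId", "Name", "Designation"});
        // Alphabetical order is Designation, EmpId, Name, so values land wrong:
        System.out.println(result); // {Designation=2, EmpId=Jorg, Name=TechAdmin}
    }
}
```

A non-persistable entity with alphabetically named, all-String attributes sidesteps exactly this mismatch.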

Microflow and Pages

Firstly, we need to create a page to accept the file upload and trigger our import action. (This page is opened via a microflow, which creates a new object of the CSVFile entity and passes it to the page as a parameter.)

Page to upload CSV file

When we click the ‘Import’ button, we call our microflow to import the file. Let’s have a look at it:

Microflow to Import Data

This microflow takes the uploaded CSV file as an input parameter; the main work here is calling and configuring the ‘Import CSV’ action from the CSV module. The parameters taken by this action are:

  1. The CSV file from which the data will be read (in this case, our input parameter CSVFile).
  2. The separator value, which must not be left empty, or we’ll run into a null pointer error.
  3. The number of lines to skip, as an Integer value.
  4. The highlight: the microflow to be invoked for the import, which does the main processing. The microflow selected in the screenshot is an example microflow provided by the CSV module for reuse. We can modify it as per our requirements, so let’s look at the modifications I have made.
Left: Original example microflow; Right: Modified example microflow

I have created a list of Employee objects, followed by the Java action ‘Read next line’, which reads the data from the CSV file and maps it to the non-persistable entity ‘Employee_CSV’.

Parameter to Read Next Line action

For each line read, we create an Employee object and map the object returned by the ‘Read next line’ action to its attributes.

Parsing and Mapping returned object in Employee attributes

Once done, the list of Employee objects is committed, we return to our main microflow, and the import is complete.
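The logic of the import microflow can be sketched in plain Java (illustration only; the module’s real Java actions stream the file rather than holding it in memory, and the class and method names here are my own). It skips the configured number of lines, reads each remaining line, splits it on the separator, and converts the String fields into a typed Employee object, including the String-to-Integer parse of EmpId:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CsvImportSketch {

    static class Employee {
        final int empId;
        final String name;
        final String designation;

        Employee(int empId, String name, String designation) {
            this.empId = empId;
            this.name = name;
            this.designation = designation;
        }
    }

    static List<Employee> importCsv(String csv, String separator, int skipLines) {
        List<Employee> employees = new ArrayList<>();
        String[] lines = csv.split("\\R"); // one entry per CSV row
        // Skip the first 'skipLines' rows, e.g. a header row
        for (String line : Arrays.asList(lines).subList(skipLines, lines.length)) {
            String[] fields = line.split(separator); // "Read next line" equivalent
            // EmpId arrives as a String and must be parsed to an Integer
            employees.add(new Employee(
                    Integer.parseInt(fields[0].trim()), fields[1], fields[2]));
        }
        return employees;
    }

    public static void main(String[] args) {
        String csv = "EmpId;Name;Designation\n1;Anna;Developer\n2;Jorg;TechAdmin";
        List<Employee> list = importCsv(csv, ";", 1);
        System.out.println(list.size() + " employees imported");
    }
}
```

Note that `skipLines` of 1 plays the same role as the ‘Skip lines’ parameter above: it jumps over the header row before any mapping happens.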

Result

CSV formatted data successfully imported in application

Exporting CSV Data

Exporting data to a CSV-formatted file is as easy as importing it and can be done in a similar way.

Let’s say we want to export the Employee data. We need to create a microflow as shown below:

Microflow created for Exporting data

Considering the same domain model as used in the import, create a CSVFile entity object and call the ‘Export CSV’ Java action from the CSV module; it takes similar parameters to the ‘Import CSV’ action. In this case, however, the microflow to be invoked needs to be created by us, and at the end a download activity provides the user with the file.

Export CSV action Parameters

Parameters to be passed to the Export action:
  1. The newly created CSVFile object.
  2. The separator value as needed; in this case it’s ‘;’.
  3. The return type of this action is Boolean, and we’ll use it to decide whether the export succeeded.
  4. The microflow to be invoked, which in this case is one we create ourselves, as shown below.
Microflow to be invoked in Export action

In this microflow, we retrieve the Employee data and iterate over it, first mapping it to the Temp entity to convert the data types to String and avoid re-arranging of columns, and then calling the Java action ‘Write next line’, which writes the data to our CSV file row by row in the correct column order.

Left: Parsing and mapping of Employee data to the Temp entity; Right: Parameters taken by the ‘Write next line’ action

Once processing is done, we return to our main microflow, and based on the return value of the ‘Export CSV’ action the CSV file is downloaded for the user.

(Note: again, mapping the data into a temporary entity during export is your choice; here it’s done to avoid the re-arranging performed by the ‘Write next line’ action.)
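The export logic above can also be sketched in plain Java (illustration only, not the module’s code; names are my own). For each Employee, every attribute is converted to a String and one row is written in an explicit column order, with the fields joined by the chosen separator (‘;’ in our case):

```java
import java.util.List;

public class CsvExportSketch {

    static class Employee {
        final int empId;
        final String name;
        final String designation;

        Employee(int empId, String name, String designation) {
            this.empId = empId;
            this.name = name;
            this.designation = designation;
        }
    }

    static String exportCsv(List<Employee> employees, String separator) {
        StringBuilder sb = new StringBuilder();
        // Header row in the column order we want
        sb.append(String.join(separator, "EmpId", "Name", "Designation")).append('\n');
        for (Employee e : employees) {
            // Equivalent of mapping to the Temp entity + "Write next line":
            // fixed column order, every value converted to String
            sb.append(String.join(separator,
                    String.valueOf(e.empId), e.name, e.designation)).append('\n');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(exportCsv(
                List.of(new Employee(2, "Jorg", "TechAdmin")), ";"));
    }
}
```

Building each row ourselves is what the Temp entity buys us in the microflow: the columns come out in the order we wrote them, not alphabetically.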

Result

Successful data export in CSV format

Summary

The CSV module is an efficient module offered by Mendix, with many more features such as ‘ExportOQLtoCSV’, ‘ExportSQLtoCSV’, and ‘CountNumberofLines’, which can be explored and used to import and export large volumes of data with high performance and low memory consumption.

Read more

From the Publisher -

If you enjoyed this article you can find more like it at our Medium page or at our own Community blog site.

For the makers looking to get started, you can sign up for a free account, and get instant access to learning with our Academy.

Interested in getting more involved with our community? You can join us in our Slack community channel or, for those who want to be more involved, look into joining one of our meetups.
