Custom setup for Power BI Citizen Development using .pbip and C# script in Tabular Editor 3

Mateusz Mossakowski
Published in Microsoft Power BI
7 min read · Dec 12, 2023

This article is a continuation of the previous one. With invaluable insights from Daniel Otykier of Tabular Editor and Bernat Agulló Roselló of ESBRINA, I have successfully completed the configuration that aligns with our organization’s requirements for Power BI Citizen Development setup.

Recently, I’ve been focusing on a Citizen Development approach for Power BI users within my organization. I’ve noticed that relying solely on a live connection to existing semantic models isn’t always sufficient because users often require loading additional tables, such as targets or mappings. Therefore, I’m interested in enabling them to import data from the centrally governed semantic models into their solutions.

To avoid burdening the semantic models used for “core” reporting, my plan is to replicate these models in a separate premium workspace. This way, people can query the data from there without overloading the models used for reporting. Since these premium workspace models can be accessed just like Analysis Services (AAS) models, I can use relatively simple M code to wrap DAX queries and enable querying.

Side note: Originally, the plan was to utilize dataflows for the citizen development approach. However, due to the lack of row-level security support in dataflows and the additional effort required for citizen developers to build relationships and measures on their own, I decided to rely on the premium workspace semantic models instead. By leveraging these models, access restrictions based on company rules can be enforced, ensuring that citizen developers can only access data within their authorized boundaries. This approach prioritizes data security while still empowering users to work with the data they are authorized to access.

The next step was to create another one-to-one copy of the semantic model. There is a key difference, however: instead of pulling data from the original sources, this copy retrieves the data directly from the premium workspace semantic model using the AnalysisServices.Database function. To facilitate this, I wrote a custom M function called fReadTable, which retrieves data from the tables in the premium workspace model in a one-to-one manner. Additionally, it cleans up the technical column names, which include the table name followed by the column name in square brackets, so that the final column names stay untouched.

(_vTableName as text, _vWorkspaceCon as text, _vDatamodel as text) =>
let
    sourceTable = AnalysisServices.Database(
        _vWorkspaceCon,
        _vDatamodel,
        [
            Query = "
                EVALUATE
                '" & _vTableName & "'
            "
        ]
    ),
    columnNames = Table.ColumnNames(sourceTable),
    removeCharacters = (columnName as text) =>
        Text.Replace(Text.Replace(columnName, _vTableName & "[", ""), "]", ""),
    cleanedColumnNames = List.Transform(columnNames, each removeCharacters(_)),
    outputTable = Table.RenameColumns(sourceTable, List.Zip({columnNames, cleanedColumnNames}))
in
    outputTable
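To illustrate, here is a hypothetical call using the sample names that appear later in this article (the column names Date and Year are made up for the example):

```
// Hypothetical call with this article's sample workspace and model names:
fReadTable("CAL DIM", "powerbi://api.powerbi.com/v1.0/myorg/Your-Premium_Workspace", "Your Data Model")

// DAX query sent to the premium workspace model:
//     EVALUATE 'CAL DIM'

// Columns come back with technical names such as "CAL DIM[Date]", "CAL DIM[Year]", ...
// and are renamed back to "Date", "Year", ...
```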

To ensure consistency, the M code for all import mode tables should be switched to the template below. To implement this template, it is necessary to create two parameters: one for the workspace connection and another for the semantic model name.

let
    Source = fReadTable("CAL DIM", #"01_Workspace_Connection", #"02_Datamodel_Name")
in
    Source

While the number of tables in these data models isn’t excessive, doing this by hand is still time-consuming and prone to mistakes or omissions. So I had the idea of automating the process by coding everything in a C# script (CitizenDevelopmentTransformDataModel), which can be used as a macro in Tabular Editor 3. This way, it can be reused for any existing or future semantic models.

using Microsoft.VisualBasic;

// Define workspace connection
string workspace_connection = Interaction.InputBox("What is the workspace you are willing to source data from?", "Please provide workspace connection here", "powerbi://api.powerbi.com/v1.0/myorg/Your-Premium_Workspace");

// Define data model name
string datamodel_name = Interaction.InputBox("What is the semantic model you are willing to source data from?", "Please provide semantic model name here", "Your Data Model");

// Define function name
string function_name = "fReadTable";
// Define workspace parameter name
string workspace_param = "01_Workspace_Connection";
// Define datamodel parameter name
string datamodel_param = "02_Datamodel_Name";

// Create a custom fReadTable function
// It is meant to create a query to pull the data from the premium workspace data model acting as AAS database
// and to remove not needed prefixes and suffixes for column names so that they stay "unchanged"
var CustomFunction = Model.AddExpression(function_name);
CustomFunction.Kind = ExpressionKind.M;
CustomFunction.Expression =
    "(_vTableName as text, _vWorkspaceCon as text, _vDatamodel as text) =>" +
    "\nlet" +
    "\n sourceTable = AnalysisServices.Database(" +
    "\n _vWorkspaceCon," +
    "\n _vDatamodel," +
    "\n [" +
    "\n Query = \"EVALUATE '\" & _vTableName & \"' \"" +
    "\n ]" +
    "\n )," +
    "\n columnNames = Table.ColumnNames(sourceTable)," +
    "\n removeCharacters = (columnName as text) =>" +
    "\n Text.Replace(Text.Replace(columnName, _vTableName & \"[\", \"\"), \"]\", \"\")," +
    "\n cleanedColumnNames = List.Transform(columnNames, each removeCharacters(_))," +
    "\n outputTable = Table.RenameColumns(sourceTable, List.Zip({columnNames, cleanedColumnNames}))" +
    "\nin" +
    "\n outputTable";

// Create a workspace parameter
var WorkspaceParameter = Model.AddExpression(workspace_param);
WorkspaceParameter.Kind = ExpressionKind.M;
WorkspaceParameter.Expression =
    "\"" + workspace_connection + "\" meta [IsParameterQuery=true, List={\"" + workspace_connection + "\"}, DefaultValue=\"" + workspace_connection + "\", Type=\"Text\", IsParameterQueryRequired=true]";

// Create a datamodel parameter
var DatamodelParameter = Model.AddExpression(datamodel_param);
DatamodelParameter.Kind = ExpressionKind.M;
DatamodelParameter.Expression =
    "\"" + datamodel_name + "\" meta [IsParameterQuery=true, List={\"" + datamodel_name + "\"}, DefaultValue=\"" + datamodel_name + "\", Type=\"Text\", IsParameterQueryRequired=true]";

// Create a list for calculated columns to be translated into data columns
var columnsToBeTranslated = new List<Tuple<string, string, bool>>();

// Iterate through M tables
// Adjust partition to use the above defined custom fReadTable function
foreach (var t in Model.Tables)
{
    if (Convert.ToString(t.SourceType) == "M" && t.Columns.Count > 0)
    {
        foreach (var p in t.Partitions)
        {
            // Set the partition expression to use the custom fReadTable function
            p.Expression =
                "let" +
                "\n Source = " + function_name + "(\"" + t.Name + "\", #\"" + workspace_param + "\", #\"" + datamodel_param + "\")" +
                "\nin" +
                "\n Source";
        }

        foreach (var c in t.Columns.ToList()) // Convert to List to avoid modification while iterating
        {
            if (Convert.ToString(c.Type) == "Calculated")
            {
                // Add the calculated column to the list for translation
                columnsToBeTranslated.Add(Tuple.Create(t.Name, c.Name, c.IsHidden));
                // Delete the original calculated column
                c.Delete();
            }
        }
    }
}

// Translate the calculated columns into data columns
foreach (var column in columnsToBeTranslated)
{
    var newColumn = Model.Tables[column.Item1].AddDataColumn(column.Item2);
    newColumn.SourceColumn = column.Item2;
    newColumn.IsHidden = column.Item3;
    if (column.Item3) // set IsAvailableInMDX to false only if the original calculated column was hidden
    {
        newColumn.IsAvailableInMDX = false;
    }
}

// Delete all "previously used" shared expressions
foreach (var exp in Model.Expressions.ToList())
{
    // Exclude the custom function and parameter expressions from deletion
    if (exp.Name != function_name && exp.Name != workspace_param && exp.Name != datamodel_param)
    {
        exp.Delete();
    }
}

Thanks to Tabular Editor 3’s convenient feature of saving C# scripts as macros, the script can be executed with just two clicks from the TE3 user interface.

After running the script and applying the changes to the model, I encountered an issue when attempting to save the changes back to the .pbix file. Power BI appears to “ignore” the applied changes and reverts the data model to its previous state, pointing to the real source data instead of the AAS/premium workspace model.

Unfortunately, it turns out that making changes to the M expressions or partitions of a model through external tools (such as Tabular Editor) while Power BI Desktop is connected to its AS instance is not supported by Microsoft (see the official documentation).

However, Daniel Otykier offered an invaluable suggestion for a workaround: using Developer Mode (a.k.a. Power BI Desktop Projects).

By saving the .pbix file as a .pbip file first and then connecting to the model.bim file from the Dataset folder, I managed to execute my C# script and save the model.

Once the changes are saved to the .bim file, I can refresh the data using the new data load approach that utilizes the AnalysisServices.Database function. This can be done by opening either the .pbix file or the definition.pbir file from the Report folder.
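For context, a saved Power BI Project has roughly the layout sketched below. The report name YourReport is made up for the example, and the exact set of files may vary by Power BI Desktop version:

```
YourReport.pbip              // project entry point, opens in Power BI Desktop
YourReport.Dataset/
    model.bim                // semantic model definition - open this in Tabular Editor 3
    definition.pbidataset
YourReport.Report/
    definition.pbir          // report definition - open this to refresh the data
    report.json
```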

After the data refresh is complete, I can save it as a .pbit (Power BI Template) file. Unlike the original file, the template stores only metadata, not the actual data. There is therefore no risk in sharing the template, as it will be populated with data only when a person with the appropriate permissions refreshes it, in line with the Row-Level Security settings.
