Adjusting all partition definitions through a C# script in Tabular Editor: a challenge

Mateusz Mossakowski
5 min read · Dec 3, 2023


This time, instead of providing a solution, I have a question for you. Recently, I’ve been focusing on a citizen development approach for Power BI users within my organization. I’ve noticed that relying solely on a live connection to existing semantic models isn’t always sufficient because users often require additional tables, such as targets or mappings. Therefore, I’m interested in enabling them to import data from the centrally governed semantic models into their solutions.

To avoid burdening the semantic models used for “core” reporting, my plan is to replicate these models in a separate premium workspace. This way, people can query the data from there without overloading the models used for reporting. Since a premium workspace model can be queried like an Analysis Services (AAS) database, relatively simple M code is enough to wrap a DAX query and expose the results.
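A minimal sketch of that wrapping pattern, using the placeholder workspace and model names that also appear in the script further below:

let
    Source = AnalysisServices.Database(
        "powerbi://api.powerbi.com/v1.0/myorg/My-Premium-Workspace",
        "My Data Model",
        [Query = "EVALUATE 'CAL DIM'"]
    )
in
    Source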

Side note: Originally, the plan was to utilize dataflows for the citizen development approach. However, due to the lack of row-level security support in dataflows and the additional effort required for citizen developers to build relationships and measures on their own, I decided to rely on the premium workspace semantic models instead. By leveraging these models, access restrictions based on company rules can be enforced, ensuring that citizen developers can only reach data within their authorized boundaries. This approach prioritizes data security while still empowering users to work with the data.

The next step was to create another one-to-one copy of the semantic model. However, there is a key difference: instead of pulling data from the original sources, the data is retrieved directly from the premium workspace semantic model using the AnalysisServices.Database function. To facilitate this, I wrote a custom M function called fReadTable, which retrieves data from the tables in the premium workspace model in a one-to-one manner. Additionally, it cleans up the technical column names returned by the DAX query, which arrive in the TableName[ColumnName] format, so that the final column names match the originals.

(_vTableName as text, _vWorkspaceCon as text, _vDatamodel as text) =>

let
    sourceTable = AnalysisServices.Database(
        _vWorkspaceCon,
        _vDatamodel,
        [
            Query = "
                EVALUATE
                '" & _vTableName & "'
            "
        ]
    ),
    columnNames = Table.ColumnNames(sourceTable),
    removeCharacters = (columnName as text) =>
        Text.Replace(Text.Replace(columnName, _vTableName & "[", ""), "]", ""),
    cleanedColumnNames = List.Transform(columnNames, each removeCharacters(_)),
    outputTable = Table.RenameColumns(sourceTable, List.Zip({columnNames, cleanedColumnNames}))
in
    outputTable
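To make the renaming step concrete: with _vTableName set to "CAL DIM", the query returns technical column names such as CAL DIM[Date], and the helper strips them back down, along the lines of this illustrative snippet:

// purely illustrative: what removeCharacters does for a single column name
Text.Replace(Text.Replace("CAL DIM[Date]", "CAL DIM" & "[", ""), "]", "")
// returns "Date"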

To ensure consistency, the M code of all import-mode tables should be switched to the template below. To implement this template, two parameters need to be created: one for the workspace connection and another for the semantic model name.

let
    Source = fReadTable("CAL DIM", #"01_Workspace_Connection", #"02_Datamodel_Name")
in
    Source
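For reference, such a Power Query parameter is itself just a shared M expression with parameter metadata attached. Once created, 01_Workspace_Connection boils down to an expression like this (the same meta record the script below generates):

"powerbi://api.powerbi.com/v1.0/myorg/My-Premium-Workspace" meta [
    IsParameterQuery = true,
    List = {"powerbi://api.powerbi.com/v1.0/myorg/My-Premium-Workspace"},
    DefaultValue = "powerbi://api.powerbi.com/v1.0/myorg/My-Premium-Workspace",
    Type = "Text",
    IsParameterQueryRequired = true
]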

Now, you might be wondering when I’ll ask my question. Well, the time has come! While the number of tables in these data models isn’t excessive, switching them all by hand is still a bit time-consuming and prone to mistakes or omissions. So, I had the idea of automating the process. I thought about coding everything in a C# script, which could be used as a macro in Tabular Editor 3. This way, it could be reused for any existing or future semantic models.

// define workspace connection
string workspace_connection = "powerbi://api.powerbi.com/v1.0/myorg/My-Premium-Workspace";
// define data model name
string datamodel_name = "My Data Model";
// define function name
string function_name = "fReadTable";
// define workspace parameter name
string workspace_param = "01_Workspace_Connection";
// define datamodel parameter name
string datamodel_param = "02_Datamodel_Name";

// create the custom fReadTable function
// it builds a query to pull the data from the premium workspace data model acting as an AAS database
// and removes the unneeded prefixes and suffixes from column names so that they stay "unchanged"
var CustomFunction = Model.AddExpression(function_name);
CustomFunction.Kind = ExpressionKind.M;
CustomFunction.Expression = "(_vTableName as text, _vWorkspaceCon as text, _vDatamodel as text) =>\n\nlet\n sourceTable = AnalysisServices.Database(\n _vWorkspaceCon,\n _vDatamodel,\n [\n Query = \"\n EVALUATE\n '\" & _vTableName & \"'\n \"\n ]\n ),\n columnNames = Table.ColumnNames(sourceTable),\n removeCharacters = (columnName as text) =>\n Text.Replace(Text.Replace(columnName, _vTableName & \"[\", \"\"), \"]\", \"\"),\n cleanedColumnNames = List.Transform(columnNames, each removeCharacters(_)),\n outputTable = Table.RenameColumns(sourceTable, List.Zip({columnNames, cleanedColumnNames}))\nin\n outputTable";

// create a workspace parameter
var WorkspaceParameter = Model.AddExpression(workspace_param);
WorkspaceParameter.Kind = ExpressionKind.M;
WorkspaceParameter.Expression = "\"" + workspace_connection + "\" meta [IsParameterQuery=true, List={\"" + workspace_connection + "\"}, DefaultValue=\"" + workspace_connection + "\", Type=\"Text\", IsParameterQueryRequired=true]";

// create a datamodel parameter
var DatamodelParameter = Model.AddExpression(datamodel_param);
DatamodelParameter.Kind = ExpressionKind.M;
DatamodelParameter.Expression = "\"" + datamodel_name + "\" meta [IsParameterQuery=true, List={\"" + datamodel_name + "\"}, DefaultValue=\"" + datamodel_name + "\", Type=\"Text\", IsParameterQueryRequired=true]";

// iterate through all non-calculated tables
// add the suffix _TBD to each existing partition
// create a new adjusted partition that uses the custom fReadTable function defined above
// then delete all "old" partitions carrying the _TBD suffix
foreach (var t in Model.Tables) {
    if (Convert.ToString(t.SourceType) == "M") {
        foreach (var p in t.Partitions) {
            if (p.Expression != "" && p.Name != "Partition") {
                p.Name = p.Name + "_TBD";
            }
        }
        var newPartition = t.AddMPartition(t.Name + "_adjusted_partition");
        newPartition.Expression = "let\r\n    Source = " + function_name + "(\"" + t.Name + "\", #\"" + workspace_param + "\", #\"" + datamodel_param + "\")\r\nin\r\n    Source";
    }
}

// delete all partitions flagged with the _TBD suffix
foreach (var p in Model.AllPartitions.ToList()) {
    if (p.Name.EndsWith("_TBD")) {
        p.Delete();
    }
}

// delete all "previously used" shared expressions other than the custom function,
// the workspace connection parameter and the semantic model parameter
foreach (var exp in Model.Expressions.ToList()) {
    if (exp.Name != function_name && exp.Name != workspace_param && exp.Name != datamodel_param) {
        exp.Delete();
    }
}

After running the script and applying the changes to the model, I encountered an issue when attempting to save the changes back to the .pbix file. It appears that Power BI “ignores” the applied changes and reverts the data model to its previous state, pointing at the real source data instead of the AAS/premium workspace model.

For further testing, I deployed the modified semantic model to the premium workspace via the XMLA endpoint to observe its behavior. Then I attempted to force a full refresh of a single partition, which resulted in an error related to a custom data connector.

Error (Premium_ASWL_Error): The data source '<ccon>{"protocol":"x-datasource","address":{"kind":"UnknownCallsite","path":"UnknownCallsite"}}</ccon>' depends on custom data connector and is not discovered by Power BI services. Please follow steps at https://aka.ms/powerbiCustomConnectorConfig to finish the configuration of on-premise data gateway and data source.

I suspect there may be an issue when transitioning from one method of querying the data (such as a SQL Server database or Databricks tables) to another (querying the premium workspace model as if it were an AAS database). This could potentially require reauthentication or reapplying the credentials, but that is just my speculation.

So now I’m totally stuck here and in need of some Sherlock-level hints or genius insights to crack this investigation! Or, you know, if the universe has conspired against me and this whole thing is just a lost cause due to reasons X, Y, and Z, then I can finally wave the white flag and move on. And sleep better 😁
