Ever wished you could write truly dynamic pipelines without the limitations of YAML's static structure? Here is how: use SparrowCI and its new feature called hub tasks.
Example: fetch multiple HTTP resources.
```yaml
tasks:
  -
    name: main
    language: Raku
    default: true
    code: |
      for config()<tasks><multifetch><state><> -> \i {
        say "url: {i<url>} - [{i<status>}]";
      }
    depends:
      -
        name: multifetch
  -
    name: multifetch
    language: Raku
    code: |
      if "{cache_root_dir()}/status.OK".IO ~~ :e {
        update_state %( url => config()<url>, status => "OK" );
      } else {
        update_state %( url => config()<url>, status => "FAIL" );
      }
    init: |
      run_task "fetch"
    subtasks:
      -
        name: fetch
        language: Bash
        code: |
          curl -fs $(config url) -o /dev/null \
          && touch $cache_root_dir/status.OK
          echo $?
    hub:
      language: Raku
      code: |
        update_state %(
          list => [
            config => { url => "https://raku.org" },
            config => { url => "https://raku.land" },
            config => { url => "http://irclogs.raku.org" },
          ],
        );
```
So, the idea is that the hub block is just a regular SparrowCI task that "creates" an execution iterator (by calling the update_state function), so the template task gets executed three times, each time with a different url parameter (config()<url>).
The task itself runs the fetch subtask, a tiny Bash script that runs a curl command. By that point the url is already known.
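To see the fetch subtask's success/failure logic outside of SparrowCI, here is a hedged stand-alone Bash sketch of the same pattern: curl's -f flag makes it exit non-zero on errors, and a marker file is created only on success. It uses file:// URLs (an assumption for the sketch, so it runs with no network) instead of the HTTP URLs the real task fetches via $(config url).

```shell
# Stand-alone sketch of the fetch subtask's logic, not SparrowCI itself.
# file:// URLs stand in for the real HTTP URLs so no network is needed.
check_url () {
  local marker
  marker=$(mktemp -d)/status.OK
  # -f: fail on transfer/HTTP errors; -s: silent.
  # The marker file is touched only when curl succeeds,
  # mirroring the "status.OK" file the real subtask creates.
  if curl -fs "$1" -o /dev/null; then
    touch "$marker"
  fi
  [ -e "$marker" ] && echo "OK" || echo "FAIL"
}

check_url "file:///dev/null"          # a resource that exists
check_url "file:///no/such/file/xyz"  # a resource that does not
```

In the real pipeline the marker file lives under cache_root_dir, which is how the Bash subtask hands its result over to the Raku code of the parent task.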
What is interesting is that the hub task's results are accumulated across all the runs and are available in the main task as the config()<tasks><multifetch><state> list.
So, in other words, the hub task idea amounts to repeating a template task with different parameters. In "static" YAML-based DSLs it is not possible to run the same task an arbitrary number of times with parameters generated dynamically in program code.
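The whole flow can be sketched in plain Bash (again, not SparrowCI — just an illustration of the pattern, using file:// URLs as a no-network assumption): a "hub" emits a list of configs, a template step runs once per config, each run's result is appended to a shared state, and a "main" step finally iterates over the accumulated state.

```shell
# Offline Bash sketch of the hub-task pattern, not SparrowCI itself.
# hub: dynamically produces one config (here, a url) per template run.
hub() {
  printf '%s\n' "file:///dev/null" "file:///no/such/file"
}

state_file=$(mktemp)

# template step, executed once per hub-generated config;
# each run appends its result to the accumulated state
hub | while read -r url; do
  if curl -fs "$url" -o /dev/null; then status=OK; else status=FAIL; fi
  echo "url: $url - [$status]" >> "$state_file"
done

# "main" step: iterate over the accumulated state, as the Raku
# main task does with config()<tasks><multifetch><state>
cat "$state_file"
```

Because the hub's output is computed in ordinary code, the number of template runs is decided at run time rather than being fixed in the YAML.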
---
That is it. Post your comments here if you like this feature.