{"id":2529,"date":"2012-04-28T23:11:27","date_gmt":"2012-04-28T22:11:27","guid":{"rendered":"http:\/\/www.devco.net\/?p=2529"},"modified":"2012-04-28T23:23:02","modified_gmt":"2012-04-28T22:23:02","slug":"trigger-puppet-runs","status":"publish","type":"post","link":"https:\/\/www.devco.net\/archives\/2012\/04\/28\/trigger-puppet-runs.php","title":{"rendered":"Trigger Puppet runs through Git hooks"},"content":{"rendered":"

Since shutting down my business I now run a small 25-node network with no Puppet Masters, and I do not schedule regular Puppet runs – I run them only when needed.<\/p>\n

Till now I’ve just done puppet runs via MCollective, basically I’d edit some puppet files and after comitting them just send off a puppet run with mcollective, supplying filters by hand so I only trigger runs on the appropriate nodes.<\/p>\n

I started looking into Git commit hooks to see if I could streamline this. I could of course just trigger a run on all nodes after every commit – with no masters there is no capacity to worry about – but that is not very elegant, so I wrote something to parse my git push and trigger runs on just the right machines.<\/p>\n

I’ll show a simplified version of the code here, the full version of the post-receive hook can be found here<\/a>. I’ve removed the parse_hiera<\/em>, parse_node<\/em> and parse_modules<\/em> functions from this but you can find them in the code linked to. To use this code you will need MCollective 2.0.0 that is due in a few days.<\/p>\n

<\/p>\n

\r\n#!\/usr\/bin\/env ruby\r\n\r\nrequire 'rubygems'\r\nrequire 'grit'\r\nrequire 'mcollective'\r\n\r\ninclude MCollective::RPC\r\n\r\n@matched_modules = []\r\n@matched_nodes = []\r\n@matched_facts = []\r\n\r\n# read each git ref in the push and process them\r\nwhile msg = gets\r\n  old_sha, new_sha, ref = msg.split(' ', 3)\r\n\r\n  repo = Grit::Repo.new(File.join(File.dirname(__FILE__), '..'))\r\n\r\n  commit = repo.commit(new_sha)\r\n\r\n  case ref\r\n    when %r{^refs\/heads\/(.*)$}\r\n      branch = $~[1]\r\n      if branch == \"master\"\r\n        puts \"Commit on #{branch}\"\r\n        commit.diffs.each do |diff|\r\n          puts \"    %s\" % diff.b_path\r\n\r\n          # parse the paths and save them to the @matched_* arrays\r\n          # these functions are in the full code paste linked to above\r\n          case diff.b_path\r\n            when \/^hieradb\/\r\n              parse_hiera(diff.b_path)\r\n            when \/^nodes\/\r\n              parse_node(diff.b_path)\r\n            when \/^common\\\/modules\/\r\n              parse_modules(diff.b_path)\r\n            else\r\n              puts \"ERROR: Do not know how to parse #{diff.b_path}\"\r\n          end\r\n        end\r\n      else\r\n        puts \"Commit on non master branch #{branch} ignoring\"\r\n      end\r\n  end\r\nend\r\n\r\nunless @matched_modules.empty? && @matched_nodes.empty? && @matched_facts.empty?\r\n  puppet = rpcclient(\"puppetd\")\r\n\r\n  nodes = []\r\n  compound_filter = []\r\n\r\n  nodes << @matched_nodes\r\n\r\n  # if classes or facts are found then do a discover\r\n  unless @matched_modules.empty? && @matched_facts.empty?\r\n    compound_filter << @matched_modules << @matched_facts\r\n\r\n    puppet.compound_filter compound_filter.flatten.uniq.join(\" or \")\r\n\r\n    nodes << puppet.discover\r\n  end\r\n\r\n  if nodes.flatten.uniq.empty?\r\n    puts \"No nodes discovered via mcollective or in commits\"\r\n    exit\r\n  end\r\n\r\n  # use new mc 2.0.0 pluggable discovery to supply node list\r\n  # that's a combination of data discovered on the network and file names\r\n  puppet.discover :nodes => nodes.flatten.uniq\r\n\r\n  puts\r\n  puts \"Files matched classes: %s\" % @matched_modules.join(\", \") unless @matched_modules.empty?\r\n  puts \"Files matched nodes: %s\" % @matched_nodes.join(\", \") unless @matched_nodes.empty?\r\n  puts \"Files matched facts: %s\" % @matched_facts.join(\", \") unless @matched_facts.empty?\r\n  puts\r\n  puts \"Triggering puppet runs on the following nodes:\"\r\n  puts\r\n  puppet.discover.in_groups_of(3) do |nodes|\r\n    puts \"   %-20s %-20s %-20s\" % nodes\r\n  end\r\n\r\n  puppet.runonce\r\n\r\n  printrpcstats\r\nelse\r\n  puts \"ERROR: Could not determine a list of nodes to run\"\r\nend\r\n<\/pre>\n

<\/code><\/p>\n

The code between lines 14 and 46 reads each line that Git feeds the post-receive hook on STDIN and processes them; you can read more about these hooks at git-scm.com<\/a>.<\/p>\n
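Each STDIN line a post-receive hook receives has the form old-sha new-sha refname. A minimal sketch of parsing it the way the hook does – the SHAs and branch names below are made up for illustration:

```ruby
# Each line git writes to a post-receive hook's STDIN has the form
# "<old-sha> <new-sha> <refname>". These sample lines are made up.
sample = [
  "0000000000000000000000000000000000000000 10ee4da0000000000000000000000000000000aa refs/heads/master",
  "7590a600000000000000000000000000000000bb 4f3a1c20000000000000000000000000000000cc refs/heads/testing"
]

sample.each do |msg|
  old_sha, new_sha, ref = msg.split(' ', 3)

  # only act on branch refs, same as the hook above
  if ref =~ %r{^refs/heads/(.*)$}
    branch = $~[1]
    puts "#{branch}: #{old_sha[0, 7]} -> #{new_sha[0, 7]}"
  end
end
```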

For each b<\/em> path (the post-change path) in the commit diff I parse the path based on Puppet module conventions, node names, my Hiera structure and some specific aspects of my file layout. The results end up in the @matched_modules<\/em>, @matched_nodes<\/em> and @matched_facts<\/em> arrays.<\/p>\n
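As an illustration, simplified stand-ins for two of the removed parse functions might look like this. The path-to-class mapping follows standard Puppet module layout, but these are my assumptions, not the code from the linked paste:

```ruby
@matched_modules = []
@matched_nodes = []

# Hypothetical stand-in for parse_modules: maps
# common/modules/<module>/manifests/<manifest>.pp to a class name,
# e.g. "mcollective::client"; init.pp maps to the module name itself.
def parse_modules(path)
  if path =~ %r{^common/modules/([^/]+)/manifests/(.+)\.pp$}
    mod, manifest = $~[1], $~[2]
    klass = manifest == "init" ? mod : "#{mod}::#{manifest.gsub('/', '::')}"
    @matched_modules << klass
  end
end

# Hypothetical stand-in for parse_node: nodes/<fqdn>.pp -> node name "<fqdn>"
def parse_node(path)
  @matched_nodes << $~[1] if path =~ %r{^nodes/([^/]+)\.pp$}
end

parse_modules("common/modules/mcollective/manifests/client.pp")
parse_node("nodes/node1.example.com.pp")

puts @matched_modules.inspect  # => ["mcollective::client"]
puts @matched_nodes.inspect    # => ["node1.example.com"]
```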

MCollective 2.0.0 lets you supply node names from any source, not just network based discovery. Here I get node names from things like my node files, file names in iptables rules and so on. Version 2.0.0 also supports a new query language for discovery, which we use here. The goal is to do a network discovery only when I have non-specific data like class names – if I found just a list of node names I do not need to go out to the network to do discovery, thanks to the new abilities of MCollective 2.0.0.<\/p>\n

In lines 48 to 90 I create an MCollective client for the puppetd<\/em> agent, discover matching nodes and trigger the Puppet runs.<\/p>\n

If any code in the git push matched classes or facts I need to do a full MCollective discovery based on those to get a node list. This is done using the new compound filtering language; the filter will look something like:<\/p>\n

<\/p>\n

\r\n\/some_class\/ or some::other::class or fact=value\r\n<\/pre>\n

<\/code><\/p>\n

But this expensive network-wide discovery only runs when the commit matched facts or classes.<\/p>\n
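Building that filter from the matched arrays is just a join, mirroring what the hook does before calling compound_filter – the sample class and fact values here are made up:

```ruby
# sample matched data as the parse functions might have collected it
matched_modules = ["mcollective::client", "/some_class/"]
matched_facts   = ["fact=value"]

# combine classes and facts into one compound discovery filter string
filter = (matched_modules + matched_facts).flatten.uniq.join(" or ")
puts filter  # => "mcollective::client or /some_class/ or fact=value"
```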

Line 72 supplies the combination of MCollective-discovered nodes and node names parsed out of the code paths as discovery data, which line 85 later uses to trigger the runs.<\/p>\n

The end result can be seen here: the commit matched only 5 of my 25 machines and only those will be run:<\/p>\n

<\/p>\n

\r\n$ git push origin master\r\nCounting objects: 13, done.\r\nDelta compression using up to 4 threads.\r\nCompressing objects: 100% (6\/6), done.\r\nWriting objects: 100% (7\/7), 577 bytes, done.\r\nTotal 7 (delta 4), reused 0 (delta 0)\r\nremote: Commit on master\r\nremote:     common\/modules\/mcollective\/manifests\/client.pp\r\nremote:\r\nremote: Files matched classes: mcollective::client\r\nremote:\r\nremote: Triggering puppet runs on the following nodes:\r\nremote:\r\nremote:    node1                node2            node3\r\nremote:    node4                node5\r\nremote:\r\nremote: 5 \/ 5\r\nremote:\r\nremote: Finished processing 5 \/ 5 hosts in 522.15 ms\r\nTo git@git:puppet.git\r\n   7590a60..10ee4da  master -> master\r\n<\/pre>\n

<\/code><\/p>\n","protected":false},"excerpt":{"rendered":"

Since shutting down my business I now run a small 25-node network with no Puppet Masters and I do not schedule regular Puppet runs – I run them just when needed. Till now I've just done puppet runs via MCollective, basically I'd edit some puppet files and after committing them just send off a […]<\/p>\n","protected":false},"author":2,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","footnotes":""},"categories":[7],"tags":[94,78,21,13],"_links":{"self":[{"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/posts\/2529"}],"collection":[{"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/comments?post=2529"}],"version-history":[{"count":21,"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/posts\/2529\/revisions"}],"predecessor-version":[{"id":2551,"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/posts\/2529\/revisions\/2551"}],"wp:attachment":[{"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/media?parent=2529"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/categories?post=2529"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.devco.net\/wp-json\/wp\/v2\/tags?post=2529"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}