Channel: Ask Puppet: Puppet DevOps Q&A Community - RSS feed

hiera interpolation

I have a module called mdwcfg which is used for Hiera configuration.

mdwcfg/hiera.yaml:

```yaml
---
version: 5
defaults:
  datadir: data
  data_hash: yaml_data
hierarchy:
  - name: "wld-pil-03"
    path: "wld-pil-03.yaml"
  - name: "wld-pil-04"
    path: "wld-pil-04.yaml"
  - name: "jdk"
    path: "jdk.yaml"
  - name: "common"
    path: "common.yaml"
```

mdwcfg/data/wld-pil-04.yaml:

```yaml
---
domain_env: 'pil'
domain_number: '04'
mdwcfg::domain_env: &domain_env 'pil'
mdwcfg::domain_number: &domain_number '04'
mdwcfg::domain_name: &domain_name "wld-%{mdwcfg::domain_env}-%{mdwcfg::domain_number}"
```

And I created a test file /tmp/kk.pp:

```puppet
include mdwcfg
$a = lookup('mdwcfg::domain_name')
notify { "a: ${a}": }
```

Puppet can't interpolate the variable mdwcfg::domain_name:

```
/opt/puppetlabs/bin/puppet apply /tmp/kk.pp --modulepath=/etc/puppetlabs/code/environments/production/modules/ --debug
....
Debug: Facter: resolving Xen facts.
Debug: Evicting cache entry for environment 'production'
Debug: Caching environment 'production' (ttl = 0 sec)
Debug: importing '/etc/puppetlabs/code/environments/production/modules/mdwcfg/manifests/init.pp' in environment production
Debug: Automatically imported mdwcfg from mdwcfg into production
Warning: Defining "data_provider": "hiera" in metadata.json is deprecated. It is ignored since a 'hiera.yaml' with version >= 5 is present (in /etc/puppetlabs/code/environments/production/modules/mdwcfg/metadata.json)
Warning: Module 'mdwcfg': Hierarchy entry "wld-pil-04" must use keys qualified with the name of the module
Warning: Module 'mdwcfg': Hierarchy entry "wld-pil-04" must use keys qualified with the name of the module
Warning: Undefined variable 'domain_env'; (file & line not available)
Warning: Undefined variable 'domain_number'; (file & line not available)
Debug: Automatic Parameter Lookup of 'mdwcfg::domain_name'
  Searching for "lookup_options"
    Global Data Provider (hiera configuration version 5)
      No such key: "lookup_options"
    Module "mdwcfg" Data Provider (hiera configuration version 5)
      Using configuration "/etc/puppetlabs/code/environments/production/modules/mdwcfg/hiera.yaml"
      Merge strategy hash
        Hierarchy entry "wld-pil-03"
          Path "/etc/puppetlabs/code/environments/production/modules/mdwcfg/data/wld-pil-03.yaml"
            Original path: "wld-pil-03.yaml"
            No such key: "lookup_options"
        Hierarchy entry "wld-pil-04"
          Path "/etc/puppetlabs/code/environments/production/modules/mdwcfg/data/wld-pil-04.yaml"
            Original path: "wld-pil-04.yaml"
            No such key: "lookup_options"
        Hierarchy entry "jdk"
          Path "/etc/puppetlabs/code/environments/production/modules/mdwcfg/data/jdk.yaml"
            Original path: "jdk.yaml"
            No such key: "lookup_options"
        Hierarchy entry "common"
          Path "/etc/puppetlabs/code/environments/production/modules/mdwcfg/data/common.yaml"
            Original path: "common.yaml"
            No such key: "lookup_options"
  Searching for "mdwcfg::domain_name"
    Global Data Provider (hiera configuration version 5)
      No such key: "mdwcfg::domain_name"
    Module "mdwcfg" Data Provider (hiera configuration version 5)
      Using configuration "/etc/puppetlabs/code/environments/production/modules/mdwcfg/hiera.yaml"
      Hierarchy entry "wld-pil-03"
        Path "/etc/puppetlabs/code/environments/production/modules/mdwcfg/data/wld-pil-03.yaml"
          Original path: "wld-pil-03.yaml"
          No such key: "mdwcfg::domain_name"
      Hierarchy entry "wld-pil-04"
        Path "/etc/puppetlabs/code/environments/production/modules/mdwcfg/data/wld-pil-04.yaml"
          Original path: "wld-pil-04.yaml"
          Interpolation on "wld-%{mdwcfg::domain_env}-%{mdwcfg::domain_number}"
            Global Scope
            Global Scope
          Found key: "mdwcfg::domain_name" value: "wld--"
Debug: Lookup of 'mdwcfg::domain_name'
  [the explicit lookup() call produces an identical search trace to the automatic parameter lookup above, again resolving to "wld--"]
Notice: Compiled catalog for vm-lab-linux-1.msc.es in environment production in 0.15 seconds
Debug: Creating default schedules
Debug: Loaded state in 0.00 seconds
Debug: Loaded state in 0.00 seconds
Debug: Loaded transaction store file in 0.00 seconds
Info: Applying configuration version '1497262320'
Notice: a: wld--
Notice: /Stage[main]/Main/Notify[a: wld--]/message: defined 'message' as 'a: wld--'
Debug: /Stage[main]/Main/Notify[a: wld--]: The container Class[Main] will propagate my refresh event
Debug: Class[Main]: The container Stage[main] will propagate my refresh event
Debug: Finishing transaction 31423700
Debug: Storing state
Debug: Stored state in 0.01 seconds
Notice: Applied catalog in 0.05 seconds
Debug: Applying settings catalog for sections reporting, metrics
Debug: Finishing transaction 34689140
Debug: Received report to process from vm-lab-linux-1.msc.es
Debug: Evicting cache entry for environment 'production'
Debug: Caching environment 'production' (ttl = 0 sec)
Debug: Processing report from vm-lab-linux-1.msc.es with processor Puppet::Reports::Store
```

Am I doing something wrong?

Thanks in advance,
Raúl
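In a Hiera 5 data file, %{mdwcfg::domain_env} interpolates the Puppet variable $mdwcfg::domain_env from the catalog's scope, not the Hiera key of the same name; that is why the trace shows "Global Scope", why the undefined-variable warnings appear, and why the result collapses to "wld--". To build one Hiera key out of other Hiera keys, Hiera 5 provides the lookup() interpolation function. A minimal sketch of the data file written that way (not verified against this module, and with the unqualified keys dropped since a module's data layer only serves keys qualified with the module name):

```yaml
# mdwcfg/data/wld-pil-04.yaml (sketch)
---
mdwcfg::domain_env: 'pil'
mdwcfg::domain_number: '04'
# %{lookup('...')} re-enters Hiera for the named key instead of reading a
# Puppet variable from scope, so the pieces resolve even though no class
# variables named domain_env/domain_number exist at compile time.
mdwcfg::domain_name: "wld-%{lookup('mdwcfg::domain_env')}-%{lookup('mdwcfg::domain_number')}"
```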

Overwriting Puppet params.pp variables with Hiera

I want to override some default variables in the params.pp file of a Puppet module. The section I am looking at is an if statement:

```puppet
if $::osfamily == 'RedHat' {
  $defaultsiteconfig = {
    'appname'      => "${app_appname}",
    'Organization' => "${app_organization}",
    'WebPath'      => "/opt/${package}${package_maj_version}",
    'WebPort'      => "${app_web_port}",
    'DatabaseType' => "${app_database_type}",
  }
}
```

The variables ${package}, ${package_maj_version}, ${app_web_port} and ${app_database_type} are working fine, since they are pulled from higher up in the params.pp file. The same is happening with ${app_appname} and ${app_organization} (they come up blank, since they are set to _undef_). But I want those two to be pulled from Hiera instead:

```yaml
app::app_appname: "example.org"
app::app_organization: "example.org"
```

Is there something in Puppet that prevents this from happening in params.pp, given that other manifests can pull those Hiera values without issue?
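Hiera's automatic parameter lookup only binds keys of the form `<class>::<parameter>` to parameters of a declared class; plain variables assigned inside params.pp never consult Hiera on their own. One way around that, sketched under the assumption that the main class is called app and that app_appname/app_organization can be promoted to class parameters:

```puppet
# Sketch only: the class name (app) and parameter names are inferred from
# the Hiera keys app::app_appname / app::app_organization shown above.
class app (
  # Automatic parameter lookup fills these from app::app_appname and
  # app::app_organization; the undef defaults apply only when no data exists.
  Optional[String] $app_appname      = undef,
  Optional[String] $app_organization = undef,
) inherits app::params {
  # Build (or override) the site config here, rather than expecting
  # params.pp to read Hiera by itself.
  $defaultsiteconfig = {
    'appname'      => $app_appname,
    'Organization' => $app_organization,
  }
}
```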

Trouble with facts/hiera after puppet upgrade

Hello, I'm attempting to upgrade Puppet 3.8 -> 4.10 (just on Vagrant right now). I'm having trouble getting my Hiera paths to resolve.

### facts.yaml
```
myfact: engineering
```

### hiera.yaml
```yaml
---
version: 5
hierarchy:
  # Most specific to least specific
  - name: "Yaml lists"
    datadir: /etc/puppetlabs/code/environments/%{::environment}/hieradata
    data_hash: yaml_data
    paths:
      - nodes/%{facts.myfact}.yaml
```

### Command
```
puppet lookup --facts /vagrant/facts.yaml --hiera_config=/vagrant/modules/puppet/files/hiera.yaml --merge deep --environment some_environment --explain --compile classes

Searching for "classes"
  Global Data Provider (hiera configuration version 5)
    Using configuration "/vagrant/modules/puppet/files/hiera.yaml"
    Hierarchy entry "Yaml lists"
      Path "/etc/puppetlabs/code/environments/some_environment/hieradata/nodes/.yaml"
        Original path: "nodes/%{facts.myfact}.yaml"
        Path not found
Function lookup() did not find a value for the name 'classes'
```

### Problem
Notice how the path didn't fill in "myfact" at all. I've also tried making "myfact" a symbol in facts.yaml and I get the same result. This behavior also happens with `puppet apply` (which was working before the upgrade). You'll notice that %{::environment} *is* working correctly, though.
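One way to check whether the fact itself is the problem is to inject it directly and let interpolation run against a live scope. A sketch, assuming myfact is an ordinary custom/external fact and using a placeholder node name:

```sh
# Facts set through FACTER_* environment variables are visible to puppet
# apply, so the hierarchy path should expand to nodes/engineering.yaml if
# interpolation is otherwise healthy.
FACTER_myfact=engineering puppet apply -e 'notice($facts["myfact"])' --debug

# puppet lookup can also take its facts from a real node's cached/PuppetDB
# facts instead of a hand-written file (the node name here is an example).
puppet lookup classes --node agent01.example.com --explain
```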

Hiera: Unable to pass parameters to nodes after declaring class (hiera_include).

Hi, I'm trying to move a class declaration out of the main manifest and use Hiera instead, on a new dev environment we recently set up. I'm using hiera_include('classes') to declare the classes, and that part works: the specific nodes pull the config from Puppet when I run the agent. What's not working is passing class parameters from Hiera after declaring them; the default values are used instead. Specifically, I am trying to pass new values for cassandra::datastax_repo::descr and cassandra::datastax_repo::pkg_url. Manifest code below:

```puppet
class cassandra::datastax_repo (
  $descr   = 'DataStax Repo for Apache Cassandra',
  $key_id  = '7E41C00F85BFC1706C4FFFB3350200F2B999A372',
  $key_url = 'http://debian.datastax.com/debian/repo_key',
  $pkg_url = undef,
  $release = 'stable',
```

Do I need to add some extra config somewhere else for this to work? For more information, I am working with this module: [locp-cassandra](https://forge.puppet.com/locp/cassandra/0.4.0), and testing by declaring the cassandra::datastax_repo class.

site.pp:
```puppet
node 'default' {
  hiera_include('classes')
}
```

hiera.yaml:
```yaml
---
:backends:
  - yaml
:yaml:
  :datadir: "/etc/puppetlabs/code/environments/%{environment}/hieradata"
:hierarchy:
  - nodes/%{trusted.certname}
  - projects/%{project}
  - common
```

hieradata:
```yaml
---
classes:
  - cassandra::datastax_repo
cassandra::datastax_repo:
  descr: Test repo
  pkg_url: http://test.url
```
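Automatic parameter lookup matches flat, fully qualified keys of the form `<class>::<parameter>`, not a hash stored under the class name, so the nested form above is never consulted for $descr or $pkg_url. A sketch of the same hieradata written the way the lookup expects:

```yaml
---
classes:
  - cassandra::datastax_repo
# One key per parameter: Hiera searches for "<class name>::<parameter name>"
# when the class is declared without explicit arguments.
cassandra::datastax_repo::descr: 'Test repo'
cassandra::datastax_repo::pkg_url: 'http://test.url'
```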

command line hiera does not work after switching to a hiera5 hiera.yaml

Hi, I am switching to Hiera 5. My open source Puppet master and nodes are working fine and are able to consume the Hiera data, but I would like to be able to run `hiera` from the command line on my Puppet master to QA my Hiera data. With the previous version of Hiera I had a shell script that executed a bunch of hiera commands. For example it would execute:

```
/opt/puppetlabs/bin/hiera controller_vip_name '::pod_prefix=fab-aos' '::pod_role=controller' '::ipaddress_eth0=123.456.789.012' '::fqdn=fake-fqdn.example.com' '::processorcount=fake-processor-count' environment=production
```

The problem is that the above command, and all of the commands executed by my script, now return `nil`. If I run the above command on my Puppet master with debug I get this output:

```
DEBUG: 2017-06-21 16:54:29 +0000: Hiera YAML backend starting
DEBUG: 2017-06-21 16:54:29 +0000: Looking up controller_vip_name in YAML backend
DEBUG: 2017-06-21 16:54:29 +0000: Ignoring bad definition in :hierarchy: 'nodes/'
DEBUG: 2017-06-21 16:54:29 +0000: Looking for data source common
DEBUG: 2017-06-21 16:54:29 +0000: Cannot find datafile /etc/puppetlabs/code/environments/production/hieradata/common.yaml, skipping
```

I have these hiera.yaml files on my Puppet master:

```
# cat /etc/puppetlabs/puppet/hiera.yaml
---
:backends:
  - yaml
:hierarchy:
  - "nodes/%{::trusted.certname}"
  - common
:yaml:
  # datadir is empty here, so hiera uses its defaults:
  # - /etc/puppetlabs/code/environments/%{environment}/hieradata on *nix
  # - %CommonAppData%\PuppetLabs\code\environments\%{environment}\hieradata on Windows
  # When specifying a datadir, make sure the directory exists.
  :datadir:
```

and a larger one:

```
# cat /etc/puppetlabs/code/environments/production/hiera.yaml
---
version: 5
defaults:
  data_hash: yaml_data   # Use the built-in YAML backend.
  datadir: "/etc/puppetlabs/code/environments/%{environment}/hieradata"
hierarchy:
  - name: "Eyaml Data"
    lookup_key: eyaml_lookup_key
    options:
      pkcs7_private_key: /etc/puppetlabs/puppet/keys/private_key.pkcs7.pem
      pkcs7_public_key: /etc/puppetlabs/puppet/keys/public_key.pkcs7.pem
    paths:
      - "nodes/%{::trusted.certname}.yaml"
      - "nodes/%{::pod_role}.yaml"
      ... many more paths omitted for brevity ....
```

I believe I have the most up-to-date software installed:

```
# puppet -V
4.10.4
# hiera --version
3.3.2
```
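The standalone `hiera` binary only reads the old global configuration and never parses a version 5 hiera.yaml, so it cannot see data that is wired up through the environment-layer config. The Hiera-5-aware command-line tool is `puppet lookup`, which goes through the same configuration files the compiler uses. A sketch of an equivalent QA command (the facts file name is just an example):

```sh
puppet lookup controller_vip_name \
  --environment production \
  --facts /tmp/fake_facts.yaml \
  --explain
```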

Hiera configuration file had wrong type

Hi, I'm getting this error when doing puppet apply (I'm running masterless Puppet). I was using a Hiera 4 configuration in hiera.yaml, but I couldn't find an eyaml configuration for that version. I switched the configuration file to the Hiera 3 format, but now I see these errors. Without upgrading my Puppet agent, is there a way to fix my issue?

```
Error: The Hiera Configuration at '/etc/puppetlabs/code/environments/production/modules/mymodule/hiera.yaml' had wrong type, entry 'hierarchy' index 0 expected a Struct value, got String
The Hiera Configuration at '/etc/puppetlabs/code/environments/production/modules/mymodule/hiera.yaml' had wrong type, entry 'hierarchy' index 1 expected a Struct value, got String
The Hiera Configuration at '/etc/puppetlabs/code/environments/production/modules/mymodule/hiera.yaml' had wrong type, entry 'hierarchy' index 2 expected a Struct value, got String
The Hiera Configuration at '/etc/puppetlabs/code/environments/production/modules/mymodule/hiera.yaml' had wrong type, expected a value for key 'version'
The Hiera Configuration at '/etc/puppetlabs/code/environments/production/modules/mymodule/hiera.yaml' had wrong type, unrecognized key 'backends'
The Hiera Configuration at '/etc/puppetlabs/code/environments/production/modules/mymodule/hiera.yaml' had wrong type, unrecognized key 'eyaml'
The Hiera Configuration at '/etc/puppetlabs/code/environments/production/modules/mymodule/hiera.yaml' had wrong type, unrecognized key 'yaml'
```

Here is my VM with packages installed:

```
Agent version: 4.5.3
Hiera: 3.2.0
```

Puppet gems installed:

```
*** LOCAL GEMS ***

bigdecimal (1.2.4)
deep_merge (1.0.1)
facter (3.3.0)
hiera (3.2.0)
hiera-eyaml (2.1.0)
highline (1.6.21)
hocon (0.9.3)
io-console (0.4.3)
json (1.8.1)
minitest (4.7.5)
net-ssh (2.9.2)
psych (2.0.5)
puppet (4.5.3)
rake (10.1.0)
rdoc (4.1.0)
semantic_puppet (0.1.2)
stomp (1.3.3)
test-unit (2.1.9.0)
trollop (2.1.2)
```
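The failing file lives inside a module (modules/mymodule/hiera.yaml), and on Puppet 4.x a module-level hiera.yaml is validated against the structured version 4/5 schema, so Hiera 3 keys such as :backends:, :yaml: and :eyaml: are rejected there regardless of which agent is installed. Hiera 3 syntax (and the eyaml backend) belongs in the global configuration instead. Sketches of each layer under those assumptions (Puppet 4.5 predates version 5, so the module layer would use version 4):

```yaml
# modules/mymodule/hiera.yaml (module layer, sketch): structured version 4
# format; eyaml is not available at this layer in that format.
---
version: 4
datadir: data
hierarchy:
  - name: "common"
    backend: yaml
    path: "common"
```

```yaml
# /etc/puppetlabs/puppet/hiera.yaml (global layer, sketch): classic Hiera 3
# syntax, which is where hiera-eyaml is configured. Key paths are examples.
---
:backends:
  - eyaml
  - yaml
:hierarchy:
  - "nodes/%{::trusted.certname}"
  - common
:yaml:
  :datadir: /etc/puppetlabs/code/environments/%{environment}/hieradata
:eyaml:
  :datadir: /etc/puppetlabs/code/environments/%{environment}/hieradata
  :pkcs7_private_key: /etc/puppetlabs/puppet/eyaml/private_key.pkcs7.pem
  :pkcs7_public_key: /etc/puppetlabs/puppet/eyaml/public_key.pkcs7.pem
```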

fully-qualified variable fails with error '...is not a hash or array when accessing it with class_name'

Hello, I have the following module and Hiera data, and I get the error '...is not a hash or array when accessing it with class_name' when using the fully-qualified variable $foo::config::writer. If I use just $writer, it is fine. I think this is a scoping issue, but I am not sure where; any thoughts?

```
Error: Could not retrieve catalog from remote server: Error 400 on SERVER: foo::config::writer is not a hash or array when accessing it with class_name at /etc/puppet/environments/production/modules/foo/manifests/config.pp:10 on node example.com
Warning: Not using cache on failed catalog
Error: Could not retrieve catalog; skipping run
```

init.pp:
```puppet
class foo {
  include foo::params
  case $foo::params::application {
    'bar': {
      foo::config { 'bar':
        query_array => [ $foo::metrics::default_metrics, $foo::metrics::bar_metrics],
      }
    }
    default: {
      foo::config { 'default':
        query_array => $foo::metrics::default_metrics,
        writer      => $foo::params::writer,
        interval    => '60',
      }
    }
  }
}
```

config.pp:
```puppet
define foo::config (
  $writer   = hiera_hash('foo::writer'),
  $interval = hiera('foo::interval', '120'),
) {
  include foo
  validate_hash($writer)
  $output = {
    class      => $foo::config::writer['class'],
    attributes => $foo::config::writer['attributes'],
  }
}
```

node.yaml:
```yaml
---
foo::queries: ''
foo::writer:
  class: 'class2'
  attributes:
    fileName: '/tmp/zoo'
    showTimeStamp: 'false'
foo::interval: '60'
```
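foo::config is a defined type, not a class, so its parameters only exist as local variables of each instance; there is no $foo::config::writer namespace variable to qualify, which is why the fully-qualified form fails while $writer works. A sketch of config.pp using the local parameter (names unchanged from the question):

```puppet
define foo::config (
  $writer   = hiera_hash('foo::writer'),
  $interval = hiera('foo::interval', '120'),
) {
  include foo
  validate_hash($writer)
  # Refer to the defined type's own parameter directly; fully-qualified
  # access only works for class variables, not defined-type parameters.
  $output = {
    class      => $writer['class'],
    attributes => $writer['attributes'],
  }
}
```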

issue with Hiera 5 used for node classification

I had this working with an older version of Hiera and Puppet 3, where I added `hiera_include('classes')` to site.pp to match all nodes. I successfully configured Hiera 5 on a new build and replaced that with the new lookup function `lookup('classes', Array[String], 'unique').include`, yet now it fails to apply the classes specified in the hierarchy, e.g. common.yaml:

```yaml
---
classes:
  - roles::default
```

I have confirmed with `puppet lookup classes --node node1.example.com` that the lookup is working; it returns:

```yaml
---
- roles::default
```

I understand that there are many variables at play here. What I am looking for is any glaring deficiency, i.e. what am I missing? Does anyone else have this working correctly?
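For comparison, the documented Hiera 5 replacement for hiera_include keeps the lookup in site.pp (inside a node block or at top scope) and can be given a default so compilation still succeeds when no classes key is found. A sketch, assuming nothing else in site.pp already declares these roles:

```puppet
# site.pp (sketch)
node default {
  # Merge the 'classes' arrays from every hierarchy level and include each one.
  lookup('classes', Array[String], 'unique', []).include
}
```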

Where is the "right" place to store files/templates that can be referenced by hiera?

In my use case, I'd like to use different motd templates and configure them in Hiera. common.yaml:

```yaml
---
motd::template: 'motd-common.erb'
```

Where, exactly, should I put 'motd-common.erb', though?
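template() and epp() resolve names of the form '<module>/<file>' against that module's templates/ directory, so the file has to live inside some module on the modulepath and be referenced with the module name in front. A sketch, assuming the motd module passes its $template parameter straight to template() and the file is checked into that module:

```yaml
# common.yaml (sketch): '<module>/<file>' resolves to
# <modulepath>/motd/templates/motd-common.erb
---
motd::template: 'motd/motd-common.erb'
```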

Where can I put files/templates where hiera can reference them?

(Sorry, I accidentally accepted the answer to my last question and it wasn't correct.) Let's say I have a module "mymodule" which uses a template to create /etc/mymodule.conf. I would like the user to be able to define this template themselves. Say my user wants a different template file for the "Linux" and "SunOS" OS families. "mymodule" exposes the class parameter "$template", which is just a path to the template to use. I'm defining all my classes via Hiera, so my site.pp is:

```puppet
hiera_include('classes')
```

Now, in Hiera I have the following:

Linux.pp:
```yaml
classes:
  - mymodule
mymodule::template: 'profile/mymodule/mymodule.conf-Linux.epp'
```

Solaris.pp:
```yaml
classes:
  - mymodule
mymodule::template: 'profile/mymodule/mymodule.conf-SunOS.epp'
```

This works fine. But the only reason it works is because I'm using a fake class called "profile" in my module path to store all of my files and templates: /profile/templates/mymodule/. This seems like a hack. Is there a better way to reference or store templates like this?
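Keeping site-specific templates in a real profile module is the usual roles/profiles convention rather than a hack; the only hard requirement is that the first path segment of the template name is a module that exists on the modulepath. A sketch of a thin profile class that owns the templates and hands the right one down (the use of $facts['kernel'] here is my assumption about how Linux vs. SunOS is detected):

```puppet
# modules/profile/manifests/mymodule.pp (sketch)
# The templates live in modules/profile/templates/mymodule/, so names like
# 'profile/mymodule/mymodule.conf-Linux.epp' resolve normally.
class profile::mymodule {
  $template = $facts['kernel'] ? {
    'SunOS' => 'profile/mymodule/mymodule.conf-SunOS.epp',
    default => 'profile/mymodule/mymodule.conf-Linux.epp',
  }

  class { 'mymodule':
    template => $template,
  }
}
```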

hiera_hash not merging

I can't seem to get hiera_hash to actually merge my hashes. Here's my setup:

init.pp (just the relevant stuff):
```puppet
class kong (
  $api_config,
) inherits kong::params {

  class { 'kong::install':
    package => $package,
    version => $version,
  }

  $hiera_hash_api_config = hiera_hash('kong::api_config')

  class { 'kong::config':
    api_config => $hiera_hash_api_config,
  }
```

config.pp:
```puppet
class kong::config (
  $api_config = $kong::api_config,
) {
  file { "${post_install_path}/kong-init.py":
    ensure  => file,
    content => template('kong/kong-init.py.erb'),
    owner   => $kong_user,
    group   => $kong_group,
    mode    => '0755',
    require => File[$post_install_path],
  }
```

And here are the YAML files.

hieradata/app_land/application/dev.yaml:
```yaml
kong::api_config:
  api1:
    upstream_url: 'url'
  api2:
    upstream_url: 'url'
  api3:
    upstream_url: 'url'
```

hieradata/application/kong.yaml:
```yaml
kong::api_config:
  api1:
    name: "api1"
    strip_uri: "True"
    unsecured:
      methods: "DELETE, POST, GET, OPTIONS"
      plugins:
        cors:
          config.methods: "DELETE, POST, GET"
          config.allowed_origin: "%{hiera('kong::allowed_origin')}"
    userinfo:
      methods: "GET, OPTIONS"
      plugins:
        cors:
          config.methods: "GET"
          config.allowed_origin: "%{hiera('kong::allowed_origin')}"
        acl:
          config.whitelist: "PUBLIC"
  api2:
    name: "api2"
    strip_uri: "True"
    unsecured:
      methods: "GET"
      plugins:
        cors:
          config.methods: "GET"
          config.allowed_origin: "%{hiera('kong::allowed_origin')}"
        acl:
          config.whitelist: "PRIVATE"
        key-auth:
          config.hide_credentials: "True"
    secured:
      methods: "POST, DELETE"
      plugins:
        cors:
          config.methods: "GET"
          config.allowed_origin: "%{hiera('kong::allowed_origin')}"
        acl:
          config.whitelist: "PRIVATE"
  api3:
    name: "api3"
    strip_uri: "True"
    default:
      methods: "OPTIONS, GET, POST, DELETE, PUT"
      group: "PUBLIC"
      plugins:
        cors:
          config.methods: "GET, POST, DELETE, PUT"
          config.allowed_origin: "%{hiera('kong::allowed_origin')}"
    clients:
      methods: "GET"
      group: "PRIVATE"
      plugins:
        cors:
          config.methods: "GET"
          config.allowed_origin: "%{hiera('kong::allowed_origin')}"
```

and the hierarchy in hiera.yaml:
```yaml
:hierarchy:
  - unique/%{::unique}
  - app_land%{::application}/%{::landscape}
  - application/%{::application}
```

When I run Puppet, the end result is that the upstream_url defined in dev.yaml appears fine, but all of the values from kong.yaml are undefined. I can't figure out why. It had been working fine when api1, api2 and api3 each had their own config hash with 'hiera_hash' applied to them, but now that I've combined them it seems to fail. I'm fairly new to Puppet, so apologies if I'm missing something obvious; I just can't seem to wrap my head around what's wrong.
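hiera_hash merges only the top level of the hash: once dev.yaml supplies an api1 key, its (small) value wins outright over the api1 hash in kong.yaml instead of being combined with it, which matches the symptom of everything from kong.yaml coming back undefined. Combining the nested keys needs a deep merge. Two sketches, assuming a deep merge is actually what is wanted:

```yaml
# hiera.yaml (Hiera 3 sketch): merge nested hash keys instead of replacing
# whole sub-hashes. Requires the deep_merge gem on the master.
:merge_behavior: deeper
```

```puppet
# Puppet 4+ alternative (sketch): request a deep merge at lookup time.
$hiera_hash_api_config = lookup('kong::api_config', Hash, 'deep')
```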

Create resources from a hash with variable environments

Hi there, I'm struggling with creating resources from a hash. I have multiple domain forests with several domains each. My admins have accounts in one domain of that forest, but mostly connect from a central server via SSH and use SSH keys to avoid passwords. I want to deploy these keys on each machine via Puppet. I do not have a 1:1 mapping between environment and domain, but I have a fact which yields a generic string telling me which domain credential to use. Each admin should have his SSH key as an authorized_key on his personal account; furthermore, ALL admin SSH keys shall be added to a 'scripting' account (a local user) which is used for special tasks. The aim is to have a central place (Hiera) where I can configure admins and their SSH keys.

Idea, in Hiera:
```yaml
admins:
  admin1:
    devdomain: devaccount1
    testdomain: testaccount1
    productiondomain: prodaccount1
    ssh:
  admin2:
    devdomain: devaccount2
    testdomain: testaccount2
    productiondomain: prodaccount1
    ssh:
```

My manifest looks something like this:
```puppet
class ssh_admin_key (
  Hash $admins = {},
) {
  $admins.each |String $key, Hash $value| {
    ssh_authorized_key { $key:
      ensure => present,
      user   => pick($value[$::my_domainidentifier]),
      type   => 'ssh-rsa',
      key    => pick($value['ssh']),
    }

    ssh_authorized_key { "script_${key}":
      ensure => present,
      user   => 'script',
      type   => 'ssh-rsa',
      key    => pick($value['ssh']),
    }
  }
}
```

For failure tolerance, the user accounts will be managed by Puppet too. I'm quite sure the code above has some mistakes, since I haven't tested it yet, but the whole thing seems a little over-complex to me (Hiera hash / environment-specific / custom fact), and I would like to ask if there is a "simpler" approach or if I'm "over-engineering" this. Thanks.
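If the $admins parameter is meant to come from Hiera via automatic parameter lookup, the data key also has to be qualified with the class name; a sketch of the corresponding data (the key material is a placeholder):

```yaml
# common.yaml (sketch): the hash feeds ssh_admin_key's $admins parameter
# when the key is "<class name>::<parameter name>".
ssh_admin_key::admins:
  admin1:
    devdomain: devaccount1
    testdomain: testaccount1
    productiondomain: prodaccount1
    ssh: 'AAAAB3Nza...'   # placeholder public key
```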

accessing hiera-hash with subkey using a fact

Hi, using open source Puppet 4.10, I'm trying to access the subkey of a Hiera hash in my Puppet module. I have to pick the subkey depending on a custom fact (customenvironment, an identifier for a domain forest). The goal is to get the correct username for a list of admins depending on the environment, while only writing them down once (in Hiera data).

**init.pp** (takes a hash called $admins, defaulting to a Hiera hash):
```puppet
[...]
$admins.each |String $key, Hash $value| {
  $username = $value.$::customenvironment
}
```

`puppet parser validate` throws a syntax error at the subkey access. The Hiera hash looks like this:

**common.yaml**
```yaml
admins:
  adminname1:
    env1: env1username1
    env2: env2username1
    env3: env3username1
  adminname2:
    env1: env1username2
    env2: env2username2
    env3: env3username2
```

env1, env2, ... are the values my customenvironment fact can take. In the end I want to get the correct username for each admin in each "environment". There is no way to do this by casing and substituting strings, since the usernames have nothing in common. Any idea how I can access the subkey depending on the fact? I would like to avoid code like this:

```puppet
if ($customenvironment == 'env1') {
  $c_env = $value.env1
} elsif ($customenvironment == 'env2') {
  $c_env = $value.env2
}
[...]
```
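Hashes are indexed with brackets in the Puppet language, and the index can itself be a variable, so the fact value can select the subkey directly; the dot form is method-call syntax, which is why the parser rejects it. A sketch with the names used above:

```puppet
$admins.each |String $key, Hash $value| {
  # Index the per-admin hash with the fact's value (e.g. 'env1').
  $username = $value[$facts['customenvironment']]
}
```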

how to convert this derdanne/nfs puppet code to hiera?

I have installed the derdanne/nfs module and have the following code, which works:

```puppet
class { '::nfs':
  #ensure            => absent,
  server_enabled     => true,
  nfs_v4             => false,
  client_enabled     => true,
  nfs_v4_client      => false,
  #nfs_v4_export_root => '/',
}

nfs::server::export { '/localview-ssf01':
  clients => '192.168.40.0/23(rw,root_squash) 192.168.44.0/24(rw,root_squash) 172.16.239.0/24(rw,root_squash)',
}
```

I would really like to move most of that into Hiera, in the node's YAML file. I didn't see an example of NFSv3 exports for Hiera in the module docs, and I'm relatively new to Hiera. Below is what is currently in my node.yaml file. This isn't the entire file, just the NFS-relevant sections; there is some other network- and sudo-related stuff above it which works fine:

```yaml
classes:
  - nfs
nfs::server::enabled: true
nfs::client::enabled: true
nfs::nfs_v4: false
nfs::nfs_v4_client: false
nfs::server::export:
  '/localview-ssf01':
    'clients': '192.168.40.0/23(rw,root_squash) 192.168.44.0/24(rw,root_squash) 172.16.239.0/24(rw,root_squash)'
```

I think the part that isn't working is where I'm trying to define the file system to export from this server. If anyone can show me an NFSv3 example of this, I would be most appreciative. I'm not getting any errors from the Puppet run, but /etc/exports isn't getting populated with the entry either; it does get populated when I apply the Puppet code above. Gene
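Two things stand out against the working Puppet code: the class parameters are server_enabled / client_enabled (so the automatic-lookup keys would be nfs::server_enabled etc., not nfs::server::enabled), and nfs::server::export is a defined type, which Hiera data alone never declares; something in a manifest still has to turn the hash into resources. A sketch of one common arrangement (the wrapper class name and the use of create_resources are my assumptions, not part of derdanne/nfs):

```yaml
# node.yaml (sketch)
classes:
  - nfs
  - profile::nfs_exports      # hypothetical wrapper class, see below
nfs::server_enabled: true
nfs::client_enabled: true
nfs::nfs_v4: false
nfs::nfs_v4_client: false
profile::nfs_exports::exports:
  '/localview-ssf01':
    clients: '192.168.40.0/23(rw,root_squash) 192.168.44.0/24(rw,root_squash) 172.16.239.0/24(rw,root_squash)'
```

```puppet
# profile/manifests/nfs_exports.pp (sketch): declare one nfs::server::export
# per entry of the hash supplied from Hiera.
class profile::nfs_exports (
  Hash $exports = {},
) {
  create_resources('nfs::server::export', $exports)
}
```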

Problem creating Hiera data for my module

Hi team,

**aoa_agent_installation/manifests/splunk_install.pp**
```puppet
class aoa_agent_installation::agent_install (
  String $myowner,
  String $mygroup,
  String $myuser,
  String $myhome,
) {
```

Now I want all four variables (myowner, mygroup, myuser, myhome) to be defined in a single Hiera file. The values of these variables differ per host/server/node. How can I have a single Hiera file from which my agent_install class looks up the variable values for all of the attached nodes/servers?

Regards,
Rohith
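With automatic parameter lookup, each parameter is looked up under <class>::<parameter>, and per-node values are normally carried by a "%{trusted.certname}" level in the hierarchy. A sketch of one node's data, assuming such a level exists (the host name and values are placeholders):

```yaml
# hieradata/nodes/host1.example.com.yaml (sketch; the file name comes from a
# "nodes/%{trusted.certname}" entry in hiera.yaml)
---
aoa_agent_installation::agent_install::myowner: 'splunk'
aoa_agent_installation::agent_install::mygroup: 'splunk'
aoa_agent_installation::agent_install::myuser: 'splunk'
aoa_agent_installation::agent_install::myhome: '/opt/splunk'
```

If all nodes really must share one file, the values would instead have to be stored as a hash keyed by node name and dug out in the manifest, since flat keys in a common file cannot vary per node.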

Passing Consul Token as Fact Into Hiera

I am converting one of my masterless modules to use Consul. How do I use external facts to pass in the Consul host and Consul token? These change in every environment and are not managed by Puppet. I am using mod 'lynxman-hiera_consul', '0.1.2'. Before my masterless run I export some facts:

```
export FACTER_CONSULHOST=consul-randomid..us-west-2.elb.amazonaws.com
export FACTER_MYTOKEN=some-token
```

I can test that this works with:

```
facter mytoken; puppet facts --debug | grep mytoken
facter consulhost; puppet facts --debug | grep consulhost
```

My hiera.yaml looks like this: [Hiera.yaml Gist](https://gist.github.com/eric-aldinger/55979bd00be8edc635e6d28da8bb6a60). It works fine if I replace the fact interpolation with strings, so the basic issue is the fact interpolation on line 15:

```yaml
:token: "%{facts.mytoken}"
```

This is my example manifest for testing this: [Consul.pp Gist](https://gist.github.com/eric-aldinger/2cb3a0e0825bff978395a1a90c0be7f6)
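Hiera 3-style backend configuration is interpolated against the Puppet scope, where facts are also exposed as plain top-scope variables, so the scope-variable form is worth comparing with the facts-hash form; whether lynxman-hiera_consul supports either in its token setting is an assumption I have not verified against that backend.

```yaml
# hiera.yaml fragment (sketch): the same fact referenced as a top-scope
# variable instead of through the facts hash.
:token: "%{::mytoken}"
```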

hiera mongodb schema

I'm looking for information on Hiera with a MongoDB backend and how the schema should be set up. Also, should there be a different database for each hierarchy level (such as environment), or are they combined into one database, and how would that be configured? Thanks in advance for any information.

Using hiera in testing environments

Hi, I have Puppet 3.6.2 and Hiera 1.3.4. I have a manifest file where I added a new 'efs' site_mount_type. Hiera file in the production environment:

```yaml
profiles::site_mount_type:
  - 'efs'
efs::primary_server:
  - '10.245.108.173'
```

While running the Puppet agent I get this error:

> Error: Could not retrieve catalog from remote server: Error 400 on SERVER: Unknown site_mount_type. Received efs at

Manifest file in the testing environment:

```puppet
$efs_server = hiera('efs::primary_server')
$site_type  = hiera('profiles::site_mount_type', 'nfs-aws')

case $site_type {
  'efs': {
    efs::mount { '/EFS/saas_data':
      alias   => "mount-${::customerid}_saas-share",
      server  => "${efs_server}:/c${::customerid}_saas_data",
      rw      => true,
      mounted => true,
      repo    => false,
    }
    ...
  default: {
    fail("Unknown site_mount_type. Received ${site_type}")
  }
```

My assumption is that Hiera is not associated with the test environment. I haven't deployed this environment and can't figure out how my predecessor was testing manifests in the testing environment. This is what I found:

```
ll -ah /etc/puppet/hiera/test/
total 16K
drwxr-x---. 2 puppet puppet 4.0K Aug 14 07:11 .
drwx------. 9 puppet puppet 4.0K Aug 11 12:36 ..
-rw-r-----. 1 puppet puppet  623 Aug 13  2015 hiera.yaml
-rwxr-x---. 1 puppet puppet 3.6K Aug 13  2015 runtest
```

This is the runtest file I found in the /etc/puppet/hiera/test folder: [C:\fakepath\runtest.PNG](/upfiles/15026964303089304.png). Can anyone figure out how my predecessor was using the testing environment with Hiera?
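If the question is which hiera.yaml the master consults, Hiera 1.x on Puppet 3 uses a single global configuration (the hiera_config setting, by default /etc/puppet/hiera.yaml); per-environment data is usually handled by interpolating %{::environment} into the datadir rather than by separate config files per environment. A sketch of that arrangement (paths are assumptions, not verified against this site's layout):

```yaml
# /etc/puppet/hiera.yaml (sketch for Hiera 1.x / Puppet 3)
---
:backends:
  - yaml
:yaml:
  :datadir: /etc/puppet/environments/%{::environment}/hieradata
:hierarchy:
  - "nodes/%{::fqdn}"
  - common
```

The hiera CLI also accepts -c to point at an alternate config, e.g. `hiera -c /etc/puppet/hiera/test/hiera.yaml profiles::site_mount_type environment=test`, which may reveal how the runtest script used that directory.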

Hiera not picking up packages resource

Hi guys, I am experimenting with Puppet 4 and have the following issue. In my **site.pp** manifest I have defined the following default node:

```puppet
node default {
  include(hiera_array('classes', []))
}
```

In my **common.yaml** file I have entered the following:

```yaml
---
classes:
  - motd
packages:
  - htop
```

The motd class is picked up just fine by my single node, but the htop package is not. When I add the htop package resource to **site.pp**, it installs without a problem:

```puppet
node default {
  include(hiera_array('classes', []))

  package { 'htop':
    ensure => 'present',
  }
}
```

How can I make Hiera install the htop package? I am using Puppet 4.10.6. Thanks
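A bare `packages:` key is just data; nothing in site.pp reads it, so no package resources are ever declared from it. Either a class included via `classes:` has to consume that key, or site.pp has to look it up itself. A minimal sketch of the second option:

```puppet
# site.pp (sketch)
node default {
  include(hiera_array('classes', []))

  # Turn the Hiera 'packages' array into package resources; an array title
  # declares one package resource per element.
  $pkgs = hiera_array('packages', [])
  package { $pkgs:
    ensure => present,
  }
}
```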

Installed puppet 5 and agent is not picking up the changes

Hello, I installed Puppet 5 on the master and agent and signed the agent's certificate, and everything worked fine. Now I want to do a simple Hiera 5 test, so I followed the official Puppet docs at https://docs.puppet.com/puppet/5.0/hiera_quick.html exactly, creating a simple module "profile" that creates "hiera_test.txt" under /tmp. When I run a test from the agent:

```
# puppet agent -t --environment production
Info: Using configured environment 'production'
Info: Retrieving pluginfacts
Info: Retrieving plugin
Info: Caching catalog for xxxxx.domain
Info: Applying configuration version '1504009389'
Notice: Applied catalog in 0.04 seconds
```

...hiera_test.txt is never created. Can anyone shed some light on what might be wrong here? Thanks in advance.
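The run compiles and applies a catalog with no resource events, which usually means the class never made it into the catalog for that node rather than Hiera failing outright. Two checks that can narrow it down, sketched with the placeholder certname from the output above and a guessed key name (the quick-start guide's actual class/key names may differ):

```sh
# Does the node's data resolve on the master? --explain shows which
# hierarchy levels and data files were consulted.
puppet lookup profile::hiera_test --node xxxxx.domain --explain

# Does the compiled catalog mention the file at all?
puppet agent -t --environment production --noop --debug | grep -i hiera_test
```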