Parse expect_out(buffer) into a variable : Tcl data structure

Sometimes we have to automate our steps through expect: run some commands on a remote machine, capture the output of a command in a variable, and use that variable in some other task.

So here is an example of how we can do that. The output from expect is always captured in expect_out(buffer), and we have to parse this to get our expected result.

So first we have to store this expect_out(buffer) in a variable, which will have multiple lines along with our expected result.

Then we have to split that variable with "\n" as the delimiter, which will create a list of all the lines.

From that list we can use lindex to extract the result from a certain position.

Here is one example.

[code lang="bash"]
#!/usr/bin/expect
set password somepass
set cmd "ls -Art /var/lib/docker/path_to_files/ | tail -n 1"

spawn ssh root@10.59.1.150
set prompt "#|%|>|\\\$ $"
expect {
    "(yes/no)" {send "yes\r";exp_continue}
    "password: " {send "$password\r";exp_continue}
    -re $prompt
}
send "$cmd\r"
expect "# "

# split the captured buffer on newlines into a Tcl list
set outcome [split $expect_out(buffer) "\n"]
# index 0 is the echoed command, index 1 is the command's output line
set filename [lindex $outcome 1]

# leave the remote shell so that eof is actually reached
send "exit\r"
expect eof
puts "##########################"
puts $filename
puts "##########################"
[/code]

python : trace line number with exception

In Python we raise exceptions to catch anything that goes wrong, but sometimes a try block has multiple lines of code and we are not able to track which line exactly is throwing the exception.

There is a way to catch the line number from which the exception is coming.

Try this way:

[code language="python"]
import sys

try:
    line1   # some statements that may raise
    line2
except Exception as E:
    # sys.exc_info()[-1] is the traceback object; tb_lineno is the failing line
    print('Error on line {}'.format(sys.exc_info()[-1].tb_lineno), type(E).__name__, E)
[/code]

Now, for any exception, you will also get the line number.
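As a minimal runnable sketch of the same trick (report_exception and risky are hypothetical names made up for this demo, not from the original snippet): tb_lineno on the outermost traceback frame reports the line inside the try block where the exception surfaced; follow tb_next if you want deeper frames.

```python
import sys

def report_exception(func):
    """Run func(); on an exception, print and return (line number, exception name)."""
    try:
        return func()
    except Exception as E:
        # line number inside this try block where the exception entered
        lineno = sys.exc_info()[-1].tb_lineno
        print('Error on line {}'.format(lineno), type(E).__name__, E)
        return (lineno, type(E).__name__)

def risky():
    x = 1
    return x / 0  # raises ZeroDivisionError

result = report_exception(risky)
```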

pexpect alternative in python for remote connection

We generally use the Python pexpect module to connect to systems remotely over ssh and execute our tasks. But sometimes the pexpect module is not installed on the systems we run from, which creates problems. This problem can be solved with the Python select module, using poll.

Here is the sample code that can be used.

https://github.com/kumarprd/pexpect-alternate
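As a rough, hypothetical sketch of the idea (not the code from the linked repo): register a subprocess's stdout with select.poll and read until timeout or EOF. The local sh/echo command stands in for an ssh invocation, and select.poll is only available on POSIX systems.

```python
import select
import subprocess

def run_with_poll(cmd, timeout_ms=5000):
    """Run cmd and collect its output via select.poll, with no pexpect dependency."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    poller = select.poll()
    poller.register(proc.stdout, select.POLLIN | select.POLLHUP)
    chunks = []
    while True:
        if not poller.poll(timeout_ms):   # no event before the timeout
            break
        data = proc.stdout.read1(4096)
        if not data:                      # EOF: the process closed its pipe
            break
        chunks.append(data)
    proc.wait()
    return b''.join(chunks)

# For a remote task you would spawn e.g. ['ssh', 'root@host', 'some command'] instead
print(run_with_poll(['sh', '-c', 'echo remote output']))
```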

SNMP Poller tool to monitor anything on the network

A few years ago, I created an SNMP Poller tool using Perl and the snmp utilities that can poll OID information from any network device, which is a kind of passive monitoring mechanism.

I thought to make it open source under the GNU GPL.

The details of the utility and its usage can be found here, if anyone is interested in using it.

https://github.com/kumarprd/snmp-poller

chef knife tricks: Add a node to an environment

Sometimes during automation of a large deployment process, we have to bootstrap a node, create an environment, and add the node to that particular environment on the fly.

  1. Bootstrapping:

[code language="bash"]
knife bootstrap myhost09vmf0209.in.ram.com -x root -P password -N node2
[/code]

2. Create the environment dynamically from inside the program (Python here):

[code language="python"]
import subprocess

## Create an envtemplate with the required values in it;
## the %s placeholders are filled from envname and appVersion
envtemplate = """
{
  "name": "%s",
  "description": "The master development branch",
  "cookbook_versions": {
  },
  "json_class": "Chef::Environment",
  "chef_type": "environment",
  "default_attributes": {
    "revrec": {
      "required_version": "%s"
    }
  },
  "override_attributes": {
  }
}
""" % (envname, appVersion)

## Write the envtemplate to a file
with open("/tmp/" + envname + ".json", "w") as f:
    f.write(envtemplate)

## Create the env from the template json file
subprocess.call("knife environment from file /tmp/" + envname + ".json", shell=True)
[/code]

3. Add the node to the environment:

[code language="bash"]
knife exec -E 'nodes.find("name:node2") {|n| n.chef_environment("env209"); n.save }'
[/code]
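The knife invocations in steps 1 and 3 can also be assembled from a small helper; this is a hypothetical sketch (host, node, and environment names are the placeholders from the examples above), written as pure string builders so the commands can be inspected before being handed to subprocess.call:

```python
def bootstrap_cmd(host, user, password, node_name):
    """Build the knife bootstrap command (step 1)."""
    return 'knife bootstrap {0} -x {1} -P {2} -N {3}'.format(
        host, user, password, node_name)

def assign_env_cmd(node_name, env_name):
    """Build the knife exec command that pins a node to an environment (step 3)."""
    return ("knife exec -E 'nodes.find(\"name:{0}\") "
            "{{|n| n.chef_environment(\"{1}\"); n.save }}'").format(node_name, env_name)

cmd1 = bootstrap_cmd('myhost09vmf0209.in.ram.com', 'root', 'password', 'node2')
cmd2 = assign_env_cmd('node2', 'env209')
# subprocess.call(cmd1, shell=True) would run it on a workstation with knife configured
```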

 

Nagios Plugins Developed @ NagiosExchange

Long ago, while working at one of my previous organizations, there were lots of components like services and servers running in the production environment. I had deployed all the products one by one from scratch, and the count kept on increasing. There were components like PLM servers, a DB server, license management, an internal portal, a container-based virtualization system, and lots more.

But there was no proper tool to monitor all the components at a time. As the count kept increasing, it became difficult to keep an eye on the up/down time of all of them.

So I decided to deploy the Nagios monitoring system in the data center and developed many plugins to use with it.

I have open-sourced a few of the plugins, which I thought could help other people in the world who may be facing these kinds of challenges.

I also posted them on Nagios Exchange about 4 years ago, and now they are a huge success. They each have been downloaded 50k+ times, and I have received thanks from many people around the world, which makes me happy.

They can be found from here: https://exchange.nagios.org/directory/Owner/divyaimca/1

Chef Recipe: Oracle DB 11gR2 EE silent deploy

Chef provides a lot of flexibility and greater choice for infrastructure automation, and I prefer it over the others.

We should design our recipes in such a way that they can be used in any environment without being modified, by maximizing the use of attributes.

I was working on a deployment project on the Linux x86-64 platform, where I had to automate all the infra components. Oracle 11g R2 EE was one of them. I will share the cookbook here in the hope that it helps others. The recipes written here are used for silent installation of the DB using a response file, after pulling the media files from a remote system.

The recipes are also made idempotent, so that rerunning the cookbook again and again never does any damage. It automatically sets an attribute for DB installed / DB running in the Chef server after a successful compile and run of the recipes.

Also, the usernames/passwords are stored in and pulled from an encrypted data bag to make it more secure.

Here is the cookbook : https://github.com/kumarprd/Ora11gR2-EE-Silent-Install-Chef-Recipe

The recipes involved use the below steps in sequence :

  1. setupenv.rb (It creates the environment that will be used by the rest of the recipes)
  2. oradb.rb (It checks the default attributes for fresh install/patch install and goes further with any operations)
  3. install_oradb.rb (Installs the Oracle database in an idempotent manner and sets the attributes in the server)
  4. create_schema.rb (This is application specific, but I will provide the template, which can be modified)

NOTE : Here, create an encrypted data bag with the below JSON props, which are accessed inside the recipes.

Follow my other post : https://thegnulinuxguy.com/2016/08/09/chef-create-encrypted-data-bag-and-keep-secrets/

{
  "id": "apppass",
  "ora_db_passwd": "dbpass",
  "oracle_pass": "orapass"
}

Any issues/suggestions are welcome.

Python : Inplace update json and maintain proper order

Sometimes we have to read an existing JSON property file and update some values in place.

If we don't use a proper approach, the update may break the JSON structure in the file.

We have to hook the JSON objects with OrderedDict from Python's collections module to remember the proper order.

Here old_value is updated with new_value :

"head":
{
    "name" : "old_value"
}

[code language="python"]
import os
import json
from collections import OrderedDict

propJson = os.path.dirname(os.path.abspath(__file__)) + "/props.json"
if os.path.isfile(propJson):
    with open(propJson, "r+") as f:
        # object_pairs_hook=OrderedDict preserves the key order from the file
        prop = json.load(f, object_pairs_hook=OrderedDict)
        prop["head"]["name"] = str(new_value)
        f.seek(0)
        json.dump(prop, f, default=str, indent=4)
        f.truncate()
[/code]
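A self-contained sketch of the same round trip (the sample file contents and the extra "tail" key are made up for the demonstration) shows that the key order survives the rewrite:

```python
import json
import tempfile
from collections import OrderedDict

# Hypothetical props file with two top-level keys, to show order is kept
sample = '{\n    "head": {\n        "name": "old_value"\n    },\n    "tail": {}\n}'

with tempfile.NamedTemporaryFile('w', suffix='.json', delete=False) as tmp:
    tmp.write(sample)
    path = tmp.name

with open(path, 'r+') as f:
    prop = json.load(f, object_pairs_hook=OrderedDict)
    prop['head']['name'] = 'new_value'   # in-place update
    f.seek(0)
    json.dump(prop, f, indent=4)
    f.truncate()

with open(path) as f:
    updated = json.load(f, object_pairs_hook=OrderedDict)

print(list(updated.keys()))   # "head" still comes before "tail"
```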

Using optparse in python

Sometimes we have to create tools that take inputs as arguments with certain options. We can create such a tool with the optparse module of Python.

Here is a small example of using it.

[code language="python"]
from optparse import OptionParser

parser = OptionParser(usage='usage: %prog [options] arguments')

parser.add_option('-a', help="setup/cleanup", action="store", dest="action")
parser.add_option('-m', help="email id", action="store", dest="email")
parser.add_option('-i', help="Input json props", action="store", dest="input")
(options, args) = parser.parse_args()
[/code]

For help, type the below (this will display all the arguments that can be used, with their format):

python tool.py -h

Usage: tool.py [options] arguments

Options:
  -h, --help  show this help message and exit
  -a ACTION   setup/cleanup
  -m EMAIL    email id
  -i INPUT    Input json props

Save it in a program and execute it as:

python tool.py -a setup -i file.json -m pdk@pdk.com

Now we can access the above inputs inside the program using the below variables:

[code language=”python”]

options.input

options.email

options.action

[/code]
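The same parser can also be exercised without a shell: parse_args accepts an explicit argv list, which is handy for testing. The values below simply mirror the command-line example above.

```python
from optparse import OptionParser

parser = OptionParser(usage='usage: %prog [options] arguments')
parser.add_option('-a', help="setup/cleanup", action="store", dest="action")
parser.add_option('-m', help="email id", action="store", dest="email")
parser.add_option('-i', help="Input json props", action="store", dest="input")

# pass an explicit list instead of reading sys.argv
(options, args) = parser.parse_args(['-a', 'setup', '-i', 'file.json', '-m', 'pdk@pdk.com'])
print(options.action, options.input, options.email)
```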

 

Send mail using Python’s smtplib module

Python has a built-in module to send mail to recipient(s) as To, Cc, and Bcc. The assumption here is that an SMTP server is configured on localhost (where the script will run).

[code language="python"]
import smtplib
from email.mime.text import MIMEText

def SendMail(file, Email, status):
    # the mail body is read from a text file
    fp = open(file, 'r')
    msg = MIMEText(fp.read())
    fp.close()
    to = Email
    cc = 'def@example.com'
    bcc = '123@example.com'
    msg['Subject'] = 'MULTINODE SETUP :: ' + status
    msg['From'] = 'abc@example.com'
    msg['To'] = to
    msg['Cc'] = cc
    # Bcc addresses go only in the envelope recipient list, never in a header,
    # otherwise every recipient can see them
    toaddr = to.split(",") + cc.split(",") + bcc.split(",")
    s = smtplib.SMTP('localhost')
    s.sendmail('something@example.com', toaddr, msg.as_string())
    s.quit()

file = '/u01/work/tmp/sidtest'
Email = 'myname@example.com'
status = 'testing'
SendMail(file, Email, status)
[/code]