Tag Archives | scripting language

Expect command and how to automate shell scripts like magic

In the previous post, we talked about writing practical shell scripts, and we saw how easy it is to write one. Today we are going to talk about a tool that does magic to our shell scripts: the Expect command, or the Expect scripting language. Expect is a language that talks with your interactive programs or scripts that require user interaction. It works by expecting specific output, then sending the scripted response without any user interaction. You can say that this tool is your robot, which will automate your scripts.


If the Expect command is not installed on your system, you can install it using the following command:

$ apt-get install expect

Or on Red Hat based systems like CentOS:

$ yum install expect

Expect Command

Before we write our first Expect script, let's look at the core Expect commands used for interaction:

spawn     Starts a script or a program.

expect     Waits for program output.

send     Sends a reply to your program.

interact     Allows you to interact with your program.

  • The spawn command is used to start a script or a program like the shell, FTP, Telnet, SSH, SCP, and so on.
  • The send command is used to send a reply to a script or a program.
  • The expect command waits for output from the program.
  • The interact command hands control back to you so you can interact with the program yourself.

We are going to write a shell script that asks some questions, and then we will make an Expect script that answers those questions.

First, the shell script will look like this:

#!/bin/bash
echo "Hello, who are you?"
read REPLY
echo "Can I ask you some questions?"
read REPLY
echo "What is your favorite topic?"
read REPLY

Now we will write the Expect script that will answer these questions automatically:

#!/usr/bin/expect -f
set timeout -1
spawn ./questions
expect "Hello, who are you?\r"
send -- "I'm Adam\r"
expect "Can I ask you some questions?\r"
send -- "Sure\r"
expect "What is your favorite topic?\r"
send -- "Technology\r"
expect eof

The first line defines the path of the expect interpreter, which is #!/usr/bin/expect.

On the second line, we disable the timeout. Then we start our script using the spawn command.

We can use spawn to run any program we want or any other interactive script.

The remaining lines are the Expect script that interacts with our shell script.

The last line expects the end-of-file marker, which marks the end of the interaction.

Now it's showtime. Let's run our answer bot, and make sure you make it executable first.

$ chmod +x ./answerbot

$ ./answerbot


Cool!! All questions are answered as we expect.

If you get errors about the location of the expect interpreter, you can get the correct location using the which command:

$ which expect

We did not interact with our script at all; the Expect program did the job for us.

The above method can be applied to any interactive script or program. Although the above Expect script is very easy to write, Expect scripts can be a little tricky for some people. For them, there is an easier way.

Using autoexpect

To build an Expect script automatically, you can use the autoexpect command.

autoexpect works like expect, but it builds the automation script for you. You pass the script you want to automate to autoexpect as a parameter, answer the questions yourself, and your answers are saved in a file.

$ autoexpect ./questions


A file called script.exp is generated. It contains the same code we wrote above, with some additions that we will leave aside for now.


If you run the auto-generated file script.exp, you will see the same answers as expected:


Awesome!! That was super easy.

Many commands produce output that changes between runs, as with FTP programs, and an Expect script may fail or get stuck. To solve this problem, you can use wildcards for the changeable data to make your script more flexible.
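For example, here is a minimal sketch of automating an anonymous FTP login, assuming a hypothetical server ftp.example.com; the glob patterns match the prompts even when the surrounding banner text changes between servers:

#!/usr/bin/expect -f
set timeout -1
spawn ftp ftp.example.com
# match the login prompt no matter what banner precedes it
expect "*Name*:"
send -- "anonymous\r"
# matches "Password:" or "password:"
expect "*assword:"
send -- "guest@example.com\r"
interact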

Working with Variables

The set command is used to define variables in Expect scripts like this:

set MYVAR 5

To access the variable, precede it with $ like this: $MYVAR

To read command-line arguments in Expect scripts, we use the following syntax:

set MYVAR [lindex $argv 0]

Here we define a variable MYVAR which equals the first passed argument.

You can get the first and the second arguments and store them in variables like this:

set my_name [lindex $argv 0]

set my_favorite [lindex $argv 1]

Let’s add variables to our script:

#!/usr/bin/expect -f
set my_name [lindex $argv 0]
set my_favorite [lindex $argv 1]
set timeout -1
spawn ./questions
expect "Hello, who are you?\r"
send -- "I'm $my_name\r"
expect "Can I ask you some questions?\r"
send -- "Sure\r"
expect "What is your favorite topic?\r"
send -- "$my_favorite\r"
expect eof

Now try to run the Expect script with some parameters to see the output:

$ ./answerbot SomeName Programming


Awesome!! Now our automated Expect script is more dynamic.

Conditional Tests

You can write conditional tests using braces like this:

expect {
    "something" { send -- "send this\r" }
    "*another" { send -- "send another\r" }
}

We are going to change our shell script to print different outputs, and we will change our Expect script to handle those conditions.

We are going to emulate the different outputs with the following script:

#!/bin/bash
let number=$RANDOM
if [ $number -gt 25000 ]; then
    echo "What is your favorite topic?"
else
    echo "What is your favorite movie?"
fi
read REPLY

A random number is generated every time you run the script, and based on that number, the script prints one of two different questions.

Let's make our Expect script that will deal with that:

#!/usr/bin/expect -f
set timeout -1
spawn ./questions
expect {
    "*topic?" { send -- "Programming\r" }
    "*movie?" { send -- "Star wars\r" }
}
expect eof


Very clear. If the script prints the topic question, the Expect script sends Programming, and if it prints the movie question, the Expect script sends Star wars. Isn't that cool?

If else Conditions

You can use if/else clauses in expect scripts like this:

#!/usr/bin/expect -f
set NUM 1
if { $NUM < 5 } {
    puts "Smaller than 5"
} elseif { $NUM > 5 } {
    puts "Bigger than 5"
} else {
    puts "Equals 5"
}


Note: The opening brace must be on the same line.

While Loops

While loops in the Expect language must use braces to contain the expression, like this:

#!/usr/bin/expect -f
set NUM 0
while { $NUM <= 5 } {
    puts "\nNumber is $NUM"
    set NUM [expr $NUM + 1]
}
puts ""


For Loops

To make a for loop in Expect, three fields must be specified, as in the following format:

#!/usr/bin/expect -f
for {set NUM 0} {$NUM <= 5} {incr NUM} {
    puts "\nNUM = $NUM"
}
puts ""


User-defined Functions

You can define a function using proc like this:

proc myfunc { TOTAL } {
    set TOTAL [expr $TOTAL + 1]
    return "$TOTAL"
}

And then you can use it like this:

#!/usr/bin/expect -f
proc myfunc { TOTAL } {
    set TOTAL [expr $TOTAL + 1]
    return "$TOTAL"
}
set NUM 0
while {$NUM <= 5} {
    puts "\nNumber $NUM"
    set NUM [myfunc $NUM]
}
puts ""


Interact Command

Sometimes your Expect script contains sensitive information that you don't want to share with other users of your Expect scripts, like passwords or other data, so you want your script to take that password from you and then continue the automation normally.

The interact command reverts the control back to the keyboard.

When this command is executed, Expect will start reading from the keyboard.

This shell script will ask about the password as shown:

#!/bin/bash
echo "Hello, who are you?"
read REPLY
echo "What is your password?"
read REPLY
echo "What is your favorite topic?"
read REPLY

Now we will write the Expect script that will prompt for the password:

#!/usr/bin/expect -f
set timeout -1
spawn ./questions
expect "Hello, who are you?\r"
send -- "Hi, I'm Adam\r"
expect "*password?\r"
interact ++ return
send "\r"
expect "*topic?\r"
send -- "Technology\r"
expect eof


After you type your password, type ++ and control will return from the keyboard back to the script.

The Expect approach has been ported to many languages like C#, Java, Perl, Python, Ruby, and shell, with almost the same concepts and syntax, due to its simplicity and usefulness.

The Expect scripting language is used in quality assurance, in network measurements such as echo response time, in automating file transfers and updates, and in many other tasks.
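As a quick sketch of such a use, the following Expect script automates an SSH password login. The user name, prompt pattern, and argument order here are assumptions you would adapt to your environment:

#!/usr/bin/expect -f
set timeout 30
set host [lindex $argv 0]
set password [lindex $argv 1]
# "user" is a placeholder account name
spawn ssh user@$host
# matches "Password:" or "password:"
expect "*assword:"
send -- "$password\r"
# hand control back to the keyboard once logged in
interact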

I hope you are now supercharged with some of the most important aspects of the Expect command and the autoexpect command, and with how to use them to automate your tasks in a smarter way.

Thank you.


30 Examples for Awk Command in Text Processing

In the previous post, we talked about the sed command, and we saw many examples of using it in text processing and how good it is at this, but it has some limitations. Sometimes you need something more powerful, giving you more control to process data. This is where the awk command comes in. The awk command, or GNU awk specifically, provides a scripting language for text processing. With the awk scripting language, you can define variables, use string and arithmetic operators, use control flow and loops, and generate formatted reports. In fact, you can process log files that contain millions of lines to output a readable report that you can benefit from.


Awk Options

The awk command is used like this:

awk options program file

Awk can take the following options:

-F fs     To specify the field separator.

-f file     To specify a file that contains the awk script.

-v var=value     To declare a variable and assign it a value.

We will see how to process files and print results using awk.

Read AWK Scripts

To define an awk script, use braces surrounded by single quotation marks like this:

awk '{print "Welcome to awk command tutorial "}'


If you type anything, it returns the same welcome string we provide.

To terminate the program, press Ctrl+D. Looks tricky? Don't panic, the best is yet to come.

Using Variables

With awk, you can process text files. Awk assigns some variables for each data field found:

  • $0 for the whole line.
  • $1 for the first field.
  • $2 for the second field.
  • $n for the nth field.

Whitespace characters, like spaces or tabs, are the default separators between fields in awk.

Check this example and see how awk processes it:

awk '{print $1}' myfile


The above example prints the first word of each line.

Sometimes the separator in a file is neither a space nor a tab but something else. You can specify it using the -F option:

awk -F: '{print $1}' /etc/passwd


This command prints the first field in the passwd file. We use the colon as a separator because the passwd file uses it.

Using Multiple Commands

To run multiple commands, separate them with a semicolon like this:

echo "Hello Tom" | awk '{$2="Adam"; print $0}'


The first command sets the $2 field to Adam. The second command prints the entire line.

Reading The Script From a File

You can type your awk script in a file and specify that file using the -f option.

Our file contains this script:

{print $1 " home at " $6}

awk -F: -f testfile /etc/passwd


Here we print the username and home path from /etc/passwd; the separator is specified with the capital -F option, which here is the colon.

You can write your awk script file with multiple statements like this:

{
    text = $1 " home at " $6
    print text
}

awk -F: -f testfile /etc/passwd


Awk Preprocessing

If you need to create a title or a header for your results, you can use the BEGIN keyword to achieve this. It runs before processing the data:

awk 'BEGIN {print "Report Title"}'

Let's apply it to something where we can see the result:

awk 'BEGIN {print "The File Contents:"}
{print $0}' myfile


Awk Postprocessing

To run a script after processing the data, use the END keyword:

awk 'BEGIN {print "The File Contents:"}
{print $0}
END {print "File footer"}' myfile


This is useful; you can use it to add a footer, for example.

Let’s combine them together in a script file:

BEGIN {
    print "Users and their corresponding home"
    print " UserName \t HomePath"
    print "___________ \t __________"
    FS=":"
}
{
    print $1 " \t " $6
}
END {
    print "The end"
}

First, the top section is created using the BEGIN keyword. Then we define the FS, print each line, and print the footer at the end.

awk -f myscript /etc/passwd


Built-in Variables

We saw that the data field variables $1, $2, $3, etc. are used to extract data fields, and we also dealt with the field separator FS.

But these are not the only variables, there are more built-in variables.

The following list shows some of the built-in variables:

FIELDWIDTHS     Specifies the field widths.

RS     Specifies the record separator.

FS     Specifies the field separator.

OFS     Specifies the output field separator.

ORS     Specifies the output record separator.

By default, the OFS variable is a space. You can set the OFS variable to specify the separator you need:

awk 'BEGIN{FS=":"; OFS="-"} {print $1,$6,$7}' /etc/passwd


Sometimes, the fields are distributed without a fixed separator. In these cases, the FIELDWIDTHS variable solves the problem.

Suppose we have this content:

1235.96521
927-8.3652
36257.8157

awk 'BEGIN{FIELDWIDTHS="3 4 3"}{print $1,$2,$3}' testfile


Look at the output. There are 3 output fields per line, and each field's length matches exactly what we assigned in FIELDWIDTHS.

Suppose that your data are distributed on different lines like the following:

Person Name
123 High Street
(222) 466-1234

Another person
487 High Street
(523) 643-8754

In the above example, awk fails to process fields properly because the fields are separated by newlines and not spaces.

You need to set the FS to the newline (\n) and the RS to a blank text, so empty lines will be considered separators.

awk 'BEGIN{FS="\n"; RS=""} {print $1,$3}' addresses


Awesome! We can read the records and fields properly.

More Variables

There are some other variables that help you to get more information:

ARGC     The number of passed parameters.

ARGV     An array of the command line parameters.

ENVIRON     An array of the shell environment variables and their corresponding values.

FILENAME     The name of the file that is processed by awk.

NF     The field count of the line being processed.

NR     The total count of processed records.

FNR     The record number within the current file.

IGNORECASE     When set, ignores the character case.

You can review the previous post shell scripting to know more about these variables.

Let’s test them.

awk 'BEGIN{print ARGC,ARGV[1]}' myfile


The ENVIRON variable retrieves the shell environment variables like this:

$ awk 'BEGIN{print ENVIRON["PATH"]}'


You can use bash variables without ENVIRON variables like this:

echo | awk -v home=$HOME '{print "My home is " home}'


The NF variable specifies the last field in the record without knowing its position:

awk 'BEGIN{FS=":"; OFS=":"} {print $1,$NF}' /etc/passwd


The NF variable can be used as a data field variable if you type it like this: $NF.

Let’s take a look at these two examples to know the difference between FNR and NR variables:

awk 'BEGIN{FS=","}{print $1,"FNR="FNR}' myfile myfile


In this example, the awk command processes two input files. It's the same file twice. The output is the first field value and the FNR variable.

Now, check the NR variable and see the difference:

awk '
BEGIN {FS=","}
{print $1,"FNR="FNR,"NR="NR}
END{print "Total",NR,"processed lines"}' myfile myfile


The FNR variable resets to 1 when awk starts processing the second file, but the NR variable keeps counting across both files.

User Defined Variables

Variable names can be anything, but they can't begin with a number.

You can assign a variable as in shell scripting like this:

awk '
BEGIN{
test="Welcome to LikeGeeks website"
print test
}'


Structured Commands

The awk scripting language supports the if conditional statement.

The testfile contains the following:

10
15
6
33
45

awk '{if ($1 > 30) print $1}' testfile


Just that simple.

You should use braces if you want to run multiple statements:

awk '{
    if ($1 > 30)
    {
        x = $1 * 3
        print x
    }
}' testfile


You can use else statements like this:

awk '{
    if ($1 > 30)
    {
        x = $1 * 3
        print x
    } else
    {
        x = $1 / 2
        print x
    }
}' testfile


Or you can type them on the same line, separating the if statement from the else with a semicolon.

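For example, applied to the same testfile, a one-line version might look like this:

awk '{if ($1 > 30) print $1 * 3; else print $1 / 2}' testfile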

While Loop

You can use the while loop to iterate over data with a condition.

cat myfile
124 127 130
112 142 135
175 158 245
118 231 147

awk '{
    sum = 0
    i = 1
    while (i < 4)
    {
        sum += $i
        i++
    }
    average = sum / 3
    print "Average:",average
}' myfile


The while loop runs, adding the value of each field to the sum variable until the i variable reaches 4. The sum is then divided by 3 to get the average of the three fields.

You can exit the loop using break command like this:

awk '{
    tot = 0
    i = 1
    while (i < 5)
    {
        tot += $i
        if (i == 3)
            break
        i++
    }
    average = tot / 3
    print "Average is:",average
}' myfile


The for Loop

The awk scripting language supports the for loops:

awk '{
    total = 0
    for (var = 1; var < 4; var++)
    {
        total += $var
    }
    avg = total / 3
    print "Average:",avg
}' myfile


Formatted Printing

The printf command in awk allows you to print formatted output using format specifiers.

The format specifiers are written like this:

%[modifier]control-letter

This list shows the format specifiers you can use with printf:

c     Prints a number as an ASCII character (or the first character of a string).

d     Prints an integer value.

e     Prints numbers in scientific notation.

f     Prints floating-point values.

o     Prints an octal value.

s     Prints a text string.

Here we use printf to format our output:

awk 'BEGIN{
    x = 100 * 100
    printf "The result is: %e\n", x
}'


The result is printed in scientific notation thanks to the %e specifier.

We are not going to try every format specifier. You know the concept.

Built-In Functions

Awk provides several built-in functions like:

Mathematical Functions

If you love math, you can use these functions in your awk scripts:

sin(x) | cos(x) | sqrt(x) | exp(x) | log(x) | rand()

And they can be used normally:

awk 'BEGIN{x=exp(5); print x}'


String Functions

There are many string functions, you can check the list, but we will examine one of them as an example and the rest is the same:

awk 'BEGIN{x = "likegeeks"; print toupper(x)}'


The toupper function converts the passed string to upper case.
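Other string functions such as length and substr work the same way. A quick sketch:

awk 'BEGIN{x = "likegeeks"; print length(x); print substr(x, 1, 4)}'

This prints the length of the string (9) and its first four characters (like).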

User Defined Functions

You can define your own functions and use them like this:

awk '
function myfunc()
{
    printf "The user %s has home path at %s\n", $1,$6
}
BEGIN{FS=":"}
{
    myfunc()
}' /etc/passwd


Here we define a function called myfunc, then we use it in our script to print output using the printf function.

I hope you like the post.

Thank you.


Ansible tutorial: Automate your systems

In a previous tutorial, we talked about the expect command, and we saw how to automate scripts in Linux using its scripting language. Today, we will take a step further in our automation techniques and talk about a tool that automates tasks more professionally and across different platforms: Ansible. We will also talk about some Ansible features such as playbooks, inventory, vault, roles, and containers.


What is Ansible?

Ansible is an open source IT tool sponsored by Red Hat that helps in configuration management, orchestration, and task and application deployment automation.

This tool is aimed at system administrators who seek to minimize recurring tasks, achieve seamless deployment, and automate easily.

Tools similar to Ansible are Puppet, SaltStack, and Chef, which are the main configuration management tools available on the market.

Each one of these tools has its own advantages and disadvantages, so choosing the right one can be a bit challenging, depending on which features are needed or which programming language is preferred.

Among the advantages of Ansible compared to the other tools: Ansible is a relatively new tool, built on Python, and it uses YAML templates for scripting its jobs.

YAML stands for "YAML Ain't Markup Language"; it is a very easy, human-readable language. This helps new users understand it quickly.

Another advantage is that with Ansible there is no need to install an agent on the hosts, which enhances communication speed. It uses both push and pull models to send commands to its Linux nodes; for Windows nodes, the WinRM protocol is used.

As we stated above, since it is a newer tool, its disadvantages include a poor GUI and a less customizable, less mature platform compared to the other tools.

Even so, Ansible is used more frequently than ever, and its downloads keep increasing.

Ansible setup on Ubuntu

As we previously mentioned, there is no need to install an agent on the hosts, unlike other tools. Ansible needs a master-node installation only, with no background process, database dependency, or always-running service, and that makes it extremely lightweight.

It is recommended to use the default package manager while installing Ansible on Ubuntu, which will install the latest stable version.

Before starting the installation process, for the Linux package installation, you have to make sure that Python 2 (version 2.6 or later) or Python 3 (version 3.5 or later) is installed.

That said, most Linux package managers, when asked to install Ansible, will download a suitable Python version and its dependencies automatically.

For the source installation, a development suite may be needed, like the build-essential package on Ubuntu.
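For example, on Ubuntu you can verify the Python version and install the build tools like this:

python3 --version
sudo apt install build-essential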

We can install Ansible on Ubuntu using one of the following two methods:

The first method is through the Ubuntu package manager.

First, add Ansible PPA for Ubuntu using the following command:

sudo apt-add-repository ppa:ansible/ansible

Second, press Enter to confirm the key server setup. Third, update the package manager using the following command:

sudo apt update

Fourth, Ansible is ready to be installed using the next command:

sudo apt install ansible

The second method of installing Ansible is from its source:

This method is helpful for users who have particular requirements, for example installing the beta or development version of Ansible. This may grant you early access to new features and future modules, but be careful: it is an unstable version that is still under development and testing. This method is also helpful if you don't want to install Ansible through the package manager. To get the Ansible source package, you can use one of the following techniques. First, download the .tar file:

wget -c https://releases.ansible.com/ansible/ansible-2.6.0rc3.tar.gz

Then unarchive it:

tar -xzvf ./ansible-2.6.0rc3.tar.gz

Second, through the GitHub source. We will first need to install the git command:

sudo apt install -y git

Then get Ansible

git clone https://github.com/ansible/ansible.git --recursive

After downloading the Ansible source using one of the previous techniques, we will start building Ansible. As we previously mentioned, we will need Python, so we can use the following commands to make sure that the Ansible requirements are met. Go to the Ansible source directory:

cd ./ansible*

Install pip using easy_install:

sudo easy_install pip

Install Python requirements

sudo pip install -r ./requirements.txt

Set up the environment in order to use Ansible:

source ./hacking/env-setup

If you are using the GitHub source, you can update the Ansible project and its submodules as follows:

git pull --rebase
git submodule update --init --recursive

The environment has to be set up properly in every shell where you use Ansible; you can make this permanent with the next two commands:

echo "export ANSIBLE_HOSTS=/etc/ansible/hosts" >> ~/.bashrc
echo "source ~/ansible/hacking/env-setup" >> ~/.bashrc

Finally, the Ansible inventory can usually be found in /etc/ansible/hosts, and its configuration file is usually found in /etc/ansible/ansible.cfg.

Ansible master node configuration

Usually the Ansible configuration file (ansible.cfg) is located in /etc/ansible/ansible.cfg or in the home directory of the user who installed Ansible.

As soon as you have installed Ansible, you can start using it with its default configuration. Next, we will discuss the most important and useful Ansible configurations that will improve your Ansible experience.

Starting with Ansible 2.4, the ansible-config command lets users list the enabled Ansible options with their values.
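For example, two of its subcommands look like this:

ansible-config list
ansible-config dump

The list subcommand shows the available options with their documentation, while dump shows the current values.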

The Ansible configuration file is divided into several sections, but in this article we will only focus on the [defaults] general section. So, let's have a look at this section's basic parameters.

Using your favorite text editor (Gedit, vi, nano...), open the ansible.cfg configuration file:

sudo nano /etc/ansible/ansible.cfg

inventory: points to the location of the inventory file that Ansible uses to know the available hosts.
inventory = /etc/ansible/hosts

roles_path: points to the location where Ansible playbooks search for extra roles.
roles_path = /etc/ansible/roles

log_path: points to the location where the Ansible log file is stored. The Ansible user must have permission to write to this file.
log_path = /var/log/ansible.log

retry_files_enabled: controls the retry feature, which lets Ansible create a .retry file any time a playbook fails. It's recommended to leave this option disabled unless you really need it, because when enabled it creates multiple files that take up space.
retry_files_enabled = False

host_key_checking: this parameter is used in constantly changing environments where old host machines are deleted and new hosts take their place, as in a cloud or virtualized environment.
host_key_checking = False

forks: indicates the number of parallel tasks that can be executed against the client hosts. By default, its value is 5, to save system resources and network bandwidth, but if you have enough resources and good bandwidth you can increase it.
forks = 5

remote_port: contains the port number used by SSH on the hosts
remote_port = 22

nocolor: determines whether Ansible uses colors in playbook and task output to mark errors and successes; 0 keeps colors enabled.
nocolor = 0

Node Configuration for Linux client

The OpenSSH server is the only important and required tool to be installed on the client node, and by default all modern Linux distributions use SSH as the main remote access tool. So, you need to check the following points carefully (a quick way to verify both is shown after the list):

  • The SSH service is always up and running.
  • The SSH port, which is 22 by default, is allowed in the system's firewall.
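A minimal sketch of these checks on a systemd-based host follows; the service may be named ssh or sshd depending on the distribution, and ufw is used here only as an example firewall:

sudo systemctl status ssh
sudo ufw allow 22/tcp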

Node Configuration for Windows client

In order to make Ansible able to remotely manage a Windows host, the following applications should be installed on the Windows nodes:

  • PowerShell version 3.0 or higher
  • .NET version 4.0

For missing requirements, there is a ready-made PowerShell script from the Ansible project that can carry out this installation automatically; you can find it at the following link: https://github.com/jborean93/ansible-windows/blob/master/scripts/Upgrade-PowerShell.ps1

Before running that script, you need to change the execution policy to Unrestricted, and you need to run the following with administrator privileges:

$link = "https://raw.githubusercontent.com/jborean93/ansible-windows/master/scripts/Upgrade-PowerShell.ps1"
$script = "$env:temp\Upgrade-PowerShell.ps1"
$username = "Admin"
$password = "secure_password"
(New-Object -TypeName System.Net.WebClient).DownloadFile($link, $script)
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Force
&$script -Version 5.1 -Username $username -Password $password -Verbose
Set-ExecutionPolicy -ExecutionPolicy Restricted -Force
$reg_winlogon_path = "HKLM:\Software\Microsoft\Windows NT\CurrentVersion\Winlogon"
Set-ItemProperty -Path $reg_winlogon_path -Name AutoAdminLogon -Value 0
Remove-ItemProperty -Path $reg_winlogon_path -Name DefaultUserName -ErrorAction SilentlyContinue
Remove-ItemProperty -Path $reg_winlogon_path -Name DefaultPassword -ErrorAction SilentlyContinue

This downloads and runs the ready-made PowerShell script, returns the execution policy back to Restricted, and cleans up the automatic logon registry entries.

Another important script needs to be run to configure WinRM so that it is up and running and listening for Ansible commands. This script is also provided ready-made by Ansible, and you can find it at the following link: https://github.com/ansible/ansible/blob/devel/examples/scripts/ConfigureRemotingForAnsible.ps1

Similarly, this script needs to be run with administrator privileges and an unrestricted execution policy; you can use the following piece of code:

$link = "https://raw.githubusercontent.com/ansible/ansible/devel/examples/scripts/ConfigureRemotingForAnsible.ps1"
$script = "$env:temp\ConfigureRemotingForAnsible.ps1"

(New-Object -TypeName System.Net.WebClient).DownloadFile($link, $script)
powershell.exe -ExecutionPolicy ByPass -File $script

If no errors are thrown, Ansible should now be able to manage this node.

YAML Basics

As we have previously mentioned, YAML is a human-friendly language that can be used to manage data. Next, we will go over YAML basics and show you how to write code using YAML.

Guidelines to create a YAML file:

  • YAML uses spaces instead of tabs.
  • YAML is case sensitive.
  • A YAML file should be saved with the .yaml (or .yml) extension.
  • A YAML file sometimes starts with "---" and ends with "...", but both markers are optional.

Since YAML can be used to write Ansible playbooks, next we will show you how easy YAML is to use. Take the following example, where we need to copy a user configuration. If you write it on one line, it will look like this:

- name: Copy user configuration copy: src=/home/admin/setup.conf dest=/usr/local/projects/ owner=setup group=dev mode=0677 backup=yes

But in case you are using YAML, it will be like this:

- name: Copy user configuration
  copy:
    src: /home/admin/setup.conf
    dest: /usr/local/projects/
    owner: setup
    group: dev
    mode: 0677
    backup: yes

Another example: a .ini inventory file can look as follows:
node0.lab.edu
[lab1servers]
node1.lab.edu
node2.lab.edu
[lab2servers]
node3.lab.edu

But in case you are using YAML, it will look like this:

all:
  hosts:
    node0.lab.edu:
  children:
    lab1servers:
      hosts:
        node1.lab.edu:
        node2.lab.edu:
    lab2servers:
      hosts:
        node3.lab.edu:

From the previous two examples, you can see that YAML is an easy-to-use, human-friendly, neat, and good-looking language.

Ansible Inventory

It is an .ini file that consists of records of the IP addresses and hostnames of the client hosts. It may also contain other variables about the hosts.

In general, the file contents are organized into groups, and each group has a name written between two square brackets, for example [Group1].

By default, the Ansible inventory file is located at /etc/ansible/hosts. However, it is recommended to put all the Ansible configuration files in a folder in the user's home directory, so the user can add and modify their configuration according to their needs. Here is an example of opening the Ansible configuration file and setting the inventory:

sudo nano /etc/ansible/ansible.cfg
inventory = /home/user1/ansible/hosts

Also, you can choose an Ansible inventory file while executing a command by adding -i option to the command:
ansible -m ping -i ~/ansible/hosts

There are two Ansible inventory types: static and dynamic. A static inventory can be used in small organizations with small to medium infrastructure. A dynamic inventory suits large organizations with a huge number of hosts, where tasks are complicated and enormous errors may start to appear. If you are adding hosts that follow a similar naming pattern, you can use a counter block as in the next example:

An inventory file with similar-style hosts:
[servers]
node0.lec
node1.lec
node2.lec
node3.lec
node4.lec

The same inventory using a counter block:
[servers]
node[0:4].lec

Ansible Playbook

An Ansible playbook is simply a systematic group of scripts that uses Ansible commands in a more organized way to install and configure systems. An Ansible playbook can perform the following tasks and delegate them to other servers:

  • Reorder multi-tier system roll-outs.
  • Apply application and system patches.
  • Collect data from client hosts and, depending on the collected data, send instant actions to servers, devices, and load balancers.

Ansible playbooks are written in YAML, which is a very simple, human-readable language compared to traditional coding languages. YAML also lets users share their code easily.
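As a minimal sketch of what a playbook looks like, the following assumes Ubuntu hosts in the [servers] inventory group and uses the standard apt and service modules:

---
- name: Ensure Apache is installed and running
  hosts: servers
  become: yes
  tasks:
    - name: Install Apache
      apt:
        name: apache2
        state: present
    - name: Start the Apache service
      service:
        name: apache2
        state: started

You would run it with a command like: ansible-playbook -i ~/ansible/hosts playbook.yml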

Ansible Roles

Converting Ansible playbooks into roles gives you the ability to turn a set of configuration management tasks into reusable modules with multiple configurations that can easily be shared when needed.

The structure of an Ansible role is very simple: it consists of several folders, each of which contains YAML files. By default each folder has one main.yml file, but it can have more than one.
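For example, the skeleton created by the ansible-galaxy init command (myrole is a hypothetical role name) looks roughly like this:

myrole/
├── defaults/main.yml    (default variables for the role)
├── files/               (static files to copy to hosts)
├── handlers/main.yml    (handlers triggered by tasks)
├── meta/main.yml        (role metadata and dependencies)
├── tasks/main.yml       (the main task list)
├── templates/           (Jinja2 templates)
└── vars/main.yml        (other role variables)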

Ansible Vault

It is Ansible's encryption tool, which allows users to encrypt various variables. Ansible Vault produces encrypted files to store variables, and those files can be moved to another location when needed.

Ansible Vault can encrypt the different forms of data found in Ansible roles and playbooks. It can also encrypt task files, in case you need to hide a variable name.
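The basic ansible-vault subcommands look like this (secrets.yml is a hypothetical file name):

ansible-vault create secrets.yml
ansible-vault encrypt secrets.yml
ansible-vault edit secrets.yml
ansible-vault decrypt secrets.yml

When a playbook uses vaulted files, run it with the --ask-vault-pass option so Ansible prompts for the vault password.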

Ansible Container

It is an open source tool that allows users to automate everything about their containers, from building to deployment to management. Ansible Container enables better code management and lets you implement containers on any cloud registry.

By default, Ansible Container is not installed as part of the Ansible installation, so you will need to set it up on a container host, and during the installation process you will need to choose a container engine to work with.
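A minimal sketch of the setup, assuming the project's Python package (named ansible-container on PyPI) and Docker as the chosen engine:

sudo pip install ansible-container[docker]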
