11 Mar

Using Docker Swarm? You're gonna love Ansible 2.8!

Ansible 2.8 introduces a huge update for the Docker Swarm modules

Ansible is something of an icon among automation platforms. It is owned by Red Hat, yet available as a free product under an open-source license and developed both by Red Hat employees and the community. Docker itself is an icon of containerization. If you use containers, you know that automation is the key to simplifying the management of a dockerized infrastructure. Many modules covering Docker operations were introduced over the Ansible 2.x releases, but Docker Swarm was not really covered. However, that is going to change soon: a huge update is coming with the Ansible 2.8 release!

Lately, I had been missing some features in Ansible that would let me perform certain operations on Docker Swarm clusters. I try to avoid the command and shell modules as much as possible – running CLI commands on a remote host is asking for trouble. I decided to fill this gap myself and, as a result, I can now call myself an Ansible community developer: the author and maintainer of a few modules – docker_swarm_facts, docker_node_facts, docker_host_facts and docker_node – and co-author of docker_swarm and the ansible.docker.swarm library.

In the next posts I want to show you how these modules work. If you are looking forward to using them, I strongly advise you to give them a try now and report any bugs you find, so we can fix them before the Ansible 2.8 release. You can find them in the devel branch of the Ansible repository.
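If you want to experiment right away, a minimal playbook sketch could look like the one below. It assumes Ansible from the devel branch and the Docker SDK for Python installed on the manager node; the inventory group and node name are only examples.

```yaml
---
- name: Try the new Docker Swarm modules
  hosts: swarm_managers              # example inventory group
  tasks:
    - name: Make sure this node is a swarm manager (initialize if needed)
      docker_swarm:
        state: present
        advertise_addr: "{{ ansible_default_ipv4.address }}"

    - name: Gather facts about the whole swarm
      docker_swarm_facts:
      register: swarm

    - name: Gather facts about the local node
      docker_node_facts:
        self: yes
      register: node

    - name: Drain an example worker before maintenance
      docker_node:
        hostname: worker-01          # example node name
        availability: drain
```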

Read More
14 Nov

Should vendors ask for a license to enable the API?

A friend let me use one of his older, unused servers in his data center, so I have a place to run some virtual machines for my automation projects. As you may expect, there is no SLA for this server, no redundancy, and only limited storage. For me, it is still better to have a lab running 24/7 there than to use my desktop PC or pay for resources in the public cloud. The server runs the ESXi 6.0 hypervisor with a free license installed.

The latest release of new VMware modules for Ansible prompted me to develop a playbook that would let me back up a virtual machine from the datastore on the remote server to my home storage. A quite simple and straightforward idea led to some unexpected problems and questions: where should the limit of the features available in free licenses be? Should automation, or at least some automated tasks, be blocked in free editions?

Read More
05 Nov

Ansible can’t read some facts from Juniper devices

Juniper automation with Ansible

It is really amazing how fast Ansible has been developing lately. Stable versions are released more often and contain more of the changes IT professionals ask for. Many of them fill the gaps between two worlds – developers and operations engineers. Unfortunately, some modules are not catching up as fast as they should, which causes problems when developing even simple tasks. I experienced this while working on a playbook example for my latest press articles for the ‘IT Professional’ magazine. The default Ansible junos_facts module couldn’t correctly read the Junos version on some devices, usually those running older firmware releases. This can be a real problem if the execution of some tasks depends on the firmware version of the router or switch.

Besides the official modules and the many roles available in the Ansible Galaxy repository, many vendors have developed their own modules and make them available for free. In many cases this should be considered a better, more secure approach, as long as the vendor repository is still maintained. In my situation it was the easiest workaround for my problem.
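As a rough illustration, a playbook using Juniper's Galaxy role could look something like the sketch below; the inventory group, connection details and the '15.1' threshold are just examples, and the exact fact names may differ between role releases.

```yaml
---
- name: Read the Junos version with the vendor-provided module
  hosts: junos_devices               # example inventory group
  connection: local
  gather_facts: no
  roles:
    - Juniper.junos                  # installed with: ansible-galaxy install Juniper.junos
  tasks:
    - name: Gather device facts over NETCONF
      juniper_junos_facts:           # credentials come from inventory or module parameters

    - name: Run a task only on devices at or above a given release
      debug:
        msg: "Junos {{ junos.version }} is recent enough"
      when: junos.version is version('15.1', '>=')   # fact name assumed from the role docs
```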

Read More
28 Aug

Let Jenkins build the Jenkins image

Jenkins and Docker - flexible environment

A properly designed and implemented automation system requires its own infrastructure. You need components such as a code and configuration repository, agents responsible for executing the automated tests, acceptance testing software and a tool to define the automation processes. The leader in that last category is Jenkins. You can use it to automate the building, testing and deployment processes. Many of Jenkins’ features are available as plugins. In my previous post I recommended running the infrastructure components as Docker containers, and that includes Jenkins as well. You can use the official images available on Docker Hub, but you will very soon find that those images are missing many components, so you need to build your own. Of course not manually! Let Jenkins build the Jenkins image!
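To give you a rough idea, a custom image could start from a Dockerfile like the sketch below; the plugin list and extra packages are only examples. Building this Dockerfile can then become just another Jenkins job.

```dockerfile
# A minimal sketch of a custom Jenkins image based on the official LTS image.
FROM jenkins/jenkins:lts

# Pre-install the plugins the jobs will need (the script ships with the official image).
RUN /usr/local/bin/install-plugins.sh git docker-workflow blueocean

# Add extra tools as root, then drop back to the jenkins user.
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends make && \
    rm -rf /var/lib/apt/lists/*
USER jenkins
```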

Read More
20 Aug

Run Jenkins in the container

Jenkins and Docker - flexible environment

The more you work with automation, the more you will like containers. They fit and scale well in the CI/CD model and are easy to manage. The whole automation infrastructure should be flexible, easy to maintain and extendable – containers fit perfectly into this model. So why not start by putting Jenkins in a container?
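For a first try, something along these lines is enough; the volume and container names are arbitrary.

```bash
# Create a named volume so jobs and configuration survive container restarts.
docker volume create jenkins_home

# Run the official Jenkins LTS image, exposing the web UI and the agent port.
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  jenkins/jenkins:lts

# The initial admin password shows up in the container logs on first start.
docker logs jenkins
```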

Read More
13 Apr

Use Ansible to check SNMP on a remote device

The best way to learn Ansible and the whole idea of automation is to start with small playbooks and then grow. If you first automate simple tasks, even those that may be easier and quicker to perform from the command line, you will learn how Ansible works. Let’s say we want to test whether SNMP is responding on a remote host (we will name it HostA). We will use SNMPv3 with the authPriv security level. And of course, we want to write an Ansible playbook and run it on the server HostNOC.
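A minimal sketch of such a playbook, built around the snmp_facts module, could look like this; it assumes pysnmp is available on HostNOC, and the SNMPv3 credentials are placeholders to be replaced.

```yaml
---
- name: Check whether SNMPv3 (authPriv) responds on HostA
  hosts: HostNOC
  gather_facts: no
  tasks:
    - name: Query HostA over SNMPv3
      snmp_facts:
        host: HostA
        version: v3
        level: authPriv
        username: snmpuser             # placeholder credentials
        integrity: sha
        authkey: "{{ snmp_authkey }}"
        privacy: aes
        privkey: "{{ snmp_privkey }}"
      register: snmp_check
      ignore_errors: yes

    - name: Report the result
      debug:
        msg: "SNMP on HostA {{ 'responded' if snmp_check is succeeded else 'did not respond' }}"
```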

Read More

28 Jul

AWS Lambda guide part IV – API Gateway and Lambda without S3

AWS Lambda Tutorial: I will show you how to create or import your Python application to Lambda, use an S3 bucket, add an S3 trigger for Lambda, and more!

It is time for the final tuning of my small certificate signing service. In the previous parts I showed you what the AWS Lambda service is and how to import a simple Python application into a serverless microservice. I also connected the Lambda function to the S3 storage service, where I keep the certificates and key files. Then I added a trigger to the function, so the Lambda function executes automatically every time someone uploads a new CSR file with a certificate request to the S3 bucket. Now I will show you how to make this function not only serverless but also storageless, using API Gateway. It is not a standard approach, but in some scenarios it might be interesting. So we will connect API Gateway and Lambda without an S3 backend for keys and certificates.
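As a preview, the handler behind an API Gateway proxy integration could be shaped roughly like this; sign_csr() is a placeholder standing in for the signing code from part I.

```python
import base64
import json


def lambda_handler(event, context):
    """Sketch of a handler that takes a CSR from the API Gateway request body
    and returns the signed certificate - no S3 bucket involved."""
    body = event.get("body") or ""
    if event.get("isBase64Encoded"):
        body = base64.b64decode(body).decode("utf-8")

    certificate = sign_csr(body)       # placeholder for the real signing routine

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"certificate": certificate}),
    }


def sign_csr(csr_pem):
    # The actual CA signing logic from part I of this guide would go here.
    raise NotImplementedError
```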

Read More

04 Jul

AWS Lambda guide part III – Adding S3 trigger in Lambda function

AWS Lambda Tutorial: I will show you how to create or import your Python application to Lambda, use an S3 bucket, add an S3 trigger for Lambda, and more!

This is the third part of the AWS Lambda tutorial. In the previous chapters I presented the small Python app I created for signing certificate requests and imported it into the AWS Lambda service (check AWS Lambda guide part I – Import your Python application to Lambda). Then I modified the code so that, instead of referencing static local files, we can read from and write to an S3 bucket (check AWS Lambda guide part II – Access to S3 service from Lambda function). Now let’s move forward and add an S3 trigger to the Lambda function.

We can always execute a Lambda function manually, either from the web panel or using the CLI. We can also execute it from another application if required. But microservices are often triggered by events. In this article I will show you how to automatically sign a certificate using my Lambda function whenever a request file is uploaded to the S3 bucket. Let me show you how to program an S3 trigger in Lambda.
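To show the shape of the event, here is a minimal handler sketch; S3 delivers one or more Records carrying the bucket name and object key of the uploaded file.

```python
def lambda_handler(event, context):
    """Sketch of a handler wired to an S3 "ObjectCreated" trigger."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print("New object uploaded: s3://{}/{}".format(bucket, key))
        # ...download the CSR, sign it and upload the certificate back,
        # as described in part II of this guide...
```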

Read More

26 Jun

AWS Lambda guide part II – Access to S3 service from Lambda function

AWS Lambda Tutorial: I will show you how to create or import your Python application to Lambda, use an S3 bucket, add an S3 trigger for Lambda, and more!

In the previous chapter I talked a little about what AWS Lambda is and the idea behind serverless computing. Furthermore, I presented the small Python application I wrote to sign certificate requests using my CA certificate (you can find out how to create one in my post How to act as your own local CA and sign certificate request from ASA). Then, after importing the sandboxed Python environment (required because of the non-standard library used for SSL; the whole procedure is described in my post How to create Python sandbox archive for AWS Lambda) and a small change in the code, we managed to execute it in Lambda. I also mentioned that we can use other AWS services in our code, for example access to S3 from Lambda.

As you remember, the initial version of my application had static paths to all files and assumed it could open them from folders on a local hard drive. If you run a function in Lambda, you need a place where you can store files. This place is AWS S3. In this chapter I will show you how to use the S3 service in a function on Lambda. We will use the boto3 library, which you can install locally on your computer using pip.
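In short, reading and writing objects boils down to a few boto3 calls like the sketch below; the bucket and key names are examples, and the Lambda execution role must allow s3:GetObject and s3:PutObject on that bucket.

```python
import boto3

s3 = boto3.client("s3")


def read_csr(bucket="my-cert-bucket", key="requests/example.csr"):
    """Download a certificate request from S3 and return it as text."""
    response = s3.get_object(Bucket=bucket, Key=key)
    return response["Body"].read().decode("utf-8")


def write_certificate(pem_data, bucket="my-cert-bucket", key="certs/example.crt"):
    """Upload the signed certificate back to S3."""
    s3.put_object(Bucket=bucket, Key=key, Body=pem_data.encode("utf-8"))
```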

Read More

09 Jun

How to create Python sandbox archive for AWS Lambda

AWS Lambda and Python

AWS Lambda now contains 1067 Python libraries that we can use in our programs. That number is big and small at the same time. It should give us flexibility in writing apps, but at the same time it is a limitation – there are many non-standard libraries that are better replacements for the default ones. I will show you how to create a Python application sandbox, and then a ZIP archive for AWS Lambda, containing libraries that are not available by default, so you can use them in your serverless application.

Using this application I have generated a list of the libraries available for Python 2.7; you can check the list here.

The idea of serverless applications is that we don't have access to the operating system. We just run our code in its own sandbox. Therefore, we can't simply install a new package if we are missing one. The solution is to provide a ZIP archive containing our application code and a Python environment that has all the non-standard libraries inside. Let me show you how to do this.
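In outline, the procedure looks like the sketch below; the virtualenv name, the library and the paths are only examples.

```bash
# Build a clean Python 2.7 sandbox and install the missing library into it.
virtualenv lambda-sandbox
. lambda-sandbox/bin/activate
pip install pyOpenSSL                      # example non-standard library; build it on a
                                           # Lambda-compatible Linux if it has compiled parts

# Put our own code next to the installed packages and zip everything flat,
# so the handler sits at the root of the archive.
cd lambda-sandbox/lib/python2.7/site-packages
cp ~/project/lambda_function.py .          # example path to the function code
zip -r9 ~/lambda-deployment.zip .
```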

Read More