Monday, February 24, 2020

Testing Tour Stop #10 : Pairing up on Observability with Abby Bangser

I had my 10th session on my Testing Tour, and it was with the awesome Abby Bangser. I have a very special thing to share about how Abby has helped me before. Back in 2018, when I got interested in public speaking for the first time, I didn't know how or where to start. I came across TechVoices (formerly Speakeasy) and straightaway approached them. I had to pick a mentor from their list, so I picked one and mentioned that I am a huge follower of Angie Jones and was inspired to get into public speaking after watching her talks. I didn't expect anything by saying that, I just wanted to share it. Within a week I got an email from Abby that Angie was going to be my mentor, and I was on top of the world. I can never thank Abby enough for that help.

I was following all the tweets and information Abby would share about observability, which slowly caught my interest and made me want to know what it is. I started reading about it in different articles and going through the Honeycomb resources, and I have also been following Charity Majors' tweets and blog. I reached out to Abby and she agreed to join my Testing Tour journey. I was super happy, yayyy!!


Abby suggested we have a 15-minute prep call to plan our Testing Tour session, which was an awesome idea. On this prep call I shared what I was interested in learning, and Abby gave me two options, from which I picked building the app she used in her workshop at Agile Testing Days Germany 2019. Abby sent me all the details about what I had to prepare before our actual pairing session. I followed all the instructions, with a lot of DMs in which Abby was super, super helpful.

Getting Ready: 


1) Create a free account on Azure

The first step was to sign up and get an account on Azure so I could create my own computer/virtual machine - https://azure.microsoft.com/

2) Create a Virtual Machine in Azure

Now that I had my account, my next step was to create a virtual machine in the Azure cloud. The goal was to create a virtual machine big enough to run the application on it. The reason this is important is that the application runs a lot of microservices (~8 services and a database) as well as all the infrastructure to make it observable (6 full applications and a bunch of tooling). Abby suggested the one mentioned in the document: an Ubuntu machine with at least 16 GiB of RAM.

First I created an SSH key using Cloud Shell Bash and saved it. Then I followed the instructions from the link Abby shared, added all the details required and selected "Standard D4s v3 (4 vcpus, 16 GiB memory)".
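I did all of this through the Azure portal, but for anyone who prefers the command line, something like the following Azure CLI sketch should create an equivalent machine. The resource group, VM name and admin username here are placeholders I made up, and UbuntuLTS was the current image alias at the time:

# create a resource group and an Ubuntu VM sized for the workshop (hypothetical names)
az group create --name testing-tour-rg --location uksouth
az vm create \
    --resource-group testing-tour-rg \
    --name observability-vm \
    --image UbuntuLTS \
    --size Standard_D4s_v3 \
    --admin-username olly \
    --generate-ssh-keys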




3) Connect to the virtual machine

Now that I had my virtual machine created successfully, the next step was to start it and connect to it. When I clicked the 'Start' button in the portal, the virtual machine started. Then I clicked on 'Connect', which gave me three different options - RDP, SSH, BASTION. I selected SSH, copied the ssh command and pasted it into the Bash terminal. I had a few challenges here with the SSH keys and using them to connect to the VM, but finally I got it connected.
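The command the portal gives you looks roughly like this (the key path, user and IP address below are placeholders, not my actual values):

# connect to the VM as the admin user, using the private key created earlier
ssh -i ~/.ssh/id_rsa <admin-user>@<public-ip-of-the-vm>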

4) Installing and running the application

I followed the link Abby shared for the instructions on installing and running the application. I just wanted to try a couple of commands as part of the pre-prep and didn't want to run all of them, as I wanted to do that during the session itself. First, I executed the command sudo -i, which starts an interactive shell as the root user, so I can run as many commands as I want until I exit that shell; otherwise I would have to add "sudo" in front of each and every command (there is a tiny illustration of the difference just before the Session section). I really like the way Abby explains things by making them so simple to understand. This was the first time I came across this command. I needed admin rights on the virtual machine I created in the cloud so I could install the application. Then I tried one command which I had no clue what it was doing -

# install docker - step 1: the prerequisite packages for adding Docker's apt repository
apt-get update
apt-get install -y \
    apt-transport-https \
    ca-certificates \
    curl \
    gnupg-agent \
    software-properties-common
  
And the rest I left for our session.
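Since sudo -i was new to me, here is a small illustration of the difference it makes when typing commands interactively (just a sketch, not part of Abby's instructions):

# option 1: prefix every privileged command with sudo
sudo apt-get update
sudo apt list --installed

# option 2: open an interactive root shell once, then drop the prefix
sudo -i
apt-get update
apt list --installed
exit    # leave the root shell when done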

Session


Finally, the wait was over and it was showtime - that is exactly how I felt about this session. We started off with a good plan and a lot of enthusiasm. Abby already knew that I had run a few commands, so the first thing she wanted me to do was to check which packages I had already installed. The command Abby gave me for doing this was

apt list --installed
This command gives you the list of installed packages. The output we got by running it was a huge list, so Abby gave me another command to run.

apt list --installed | wc -l

Piping the output of apt list --installed into the wc command with the -l option counts the lines of output, so instead of the whole list we just get the number of installed packages. Wow, I was learning about commands and not just typing them in and seeing the results. Abby was explaining what each command was doing, which was amazing. While doing this Abby mentioned yet another new term: that we were Yak Shaving. I was curious to find out what this is and I found that -

Yak shaving is what you are doing when you're doing some stupid, fiddly little task that bears no obvious relationship to what you're supposed to be working on, but yet a chain of twelve causal relations links what you're doing to the original meta-task.


We were trying to run different commands and Abby was explaining each one to me in detail, so I think we were Yak Shaving, which was good as I was able to understand each command. Next we wanted to check whether apt-transport-https was already installed by using the command:

apt list --installed | grep transport-https
Grep is a Linux/Unix command-line tool used to search for a string of characters in a specified file or input. The grep command is handy when searching through a massive log file. And if we want to split a command over multiple lines, we need to end each line with a '\'.
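As a small aside, this is the kind of thing grep gets used for day to day - a hedged sketch, with a made-up log path and search term:

# print every line in the log that mentions "error", ignoring case
grep -i "error" /var/log/syslog
# -c counts the matching lines instead of printing them
grep -ic "error" /var/log/syslog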
Then we started to install Docker, which was part of the instructions for Abby's application -
# add Docker's official GPG key so apt trusts packages from Docker's repository
curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
# register the Docker apt repository for this Ubuntu release
add-apt-repository \
   "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
   $(lsb_release -cs) \
   stable"
# refresh the package index and install Docker Engine
apt-get update
apt-get install -y docker-ce docker-ce-cli containerd.io
Here Abby explained to me that the curl command downloads Docker's GPG key and apt-key add tells apt that packages signed with that key can be trusted, while add-apt-repository gives apt access to Docker's repo. The way Abby explained what each command is and what it does was such a helpful and interesting way to learn, as I was not just copying and pasting the commands but understanding the context of each one. The commands above install Docker. And what do you expect when two testers are trying to do this - of course we wanted to confirm that Docker was installed, so we checked it by running the docker command.
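A couple of typical checks would look like this (a hedged sketch, not necessarily the exact commands we typed):

# print the installed Docker version
docker --version
# confirm the daemon is running and can start containers
docker run --rm hello-world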
curl -L "https://github.com/docker/compose/releases/download/1.24.1/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose

So it was time again to test whether it was installed, by listing /usr/local/bin and checking that we could see docker-compose there. We used the command -
 ls /usr/local/bin
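Another quick check (not one we ran in the session, just a common alternative) is to ask the binary for its version:

# should report 1.24.1 if the download and chmod worked
docker-compose --version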
Now the next step was to create a user, create a .ssh folder and add the SSH public key, so we could clone the repo as that user.
# create user
useradd -m -G docker -s /bin/bash olly
echo "olly ALL=(ALL:ALL) NOPASSWD:ALL" >> /etc/sudoers

# add ssh pub key
runuser -l olly -c 'mkdir /home/olly/.ssh'
runuser -l olly -c 'echo "" > /home/olly/.ssh/authorized_keys'   # paste your SSH public key between the quotes

# checkout repo
runuser -l olly -c 'git clone https://github.com/feature-creeps/observability-workshop.git $HOME/observability-workshop'
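Each of these commands uses runuser -l olly -c '...' to run the quoted command as the new olly user with a login shell, which is why the repo ends up under olly's home directory. A quick way (hedged, not from the session) to confirm the clone worked:

# list the contents of the cloned workshop repo as the olly user
runuser -l olly -c 'ls $HOME/observability-workshop'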
Finally, we had our repository cloned and the next step was to run the stack by using the following command -
runuser -l olly -c '$HOME/observability-workshop/start-stack-in-level.sh 9'

It took quite a while for the stack to come up, and it was already 11 pm for both of us, as we were both located in London. We finally got the entire application up and running on my VM in the Azure cloud, which was such an awesome feeling. I was able to access the frontend and all the other tools - Grafana, Kibana, Prometheus and Zipkin. Now that I had this application running, I could use it and explore different levels of observability.
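With the stack up, a handy (hedged) way to see everything that is running and which ports each service exposes:

# list the running containers with their published ports
runuser -l olly -c 'docker ps --format "table {{.Names}}\t{{.Ports}}"'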

Learnings

  • Got out of my comfort zone and built my own VM in the Azure cloud, with lots of help from Abby.
  • Learned new Linux commands and tried each of them hands-on.
  • Used a whole set of new tools for the first time, but instead of fearing the unknown I thoroughly enjoyed trying something new.
  • Planning and pairing are so powerful that so much could be achieved in just a couple of hours.
  • A testing mindset can be applied anywhere and everywhere, like how we tested each command as we ran it.
This is just a short summary of my session; the actual session was even more fun, with a lot more hands-on work. It was really an amazing session and I thoroughly enjoyed it. In fact, I didn't stop learning after this session: I continued learning more commands, playing around with the VM, creating more new VMs (as Abby mentioned to me, it's a great way to build some muscle memory) and using the application to learn more about different logging and metrics tools. It's just the beginning for me and I hope to keep working on this and learn more about this topic.
