Learning Linux on my own.

Jul 9, 2016

I always wanted to learn the commands. And 'typing without looking at the keyboard' was a fantasy. And then life happened. Over the course of the last 25 years, I got better at the keyboard (despite touch screens :-), but technology, served on a platter, made me a mindless info-consumer. Not to blame Microsoft or Apple, but they have made consumption too easy with point-and-click graphical interfaces, and now with touch devices even the mouse is nearing extinction. In a way it's like fast food: more calories at lower cost with less effort. Easy information with little or no cost to the brain makes us dumb. The unspoken first lesson to interface developers these days is to assume that users are dumb, that they won't even read a 'readme' file. Isn't that a shame? I think removing these shiny interfaces and getting back to the command line is as difficult as shedding those extra forty pounds.

But it needs to be done.

The plan is to start with the Terminal, learning one command every day and publishing it here as an open journal entry. Do just a tiny bit daily. In no particular order. No subscription to any organized course. Try not to digress, though I know I will :-)


I am using an old laptop with Peppermint OS on it. Installation is very simple and step-by-step instructions are available here. I chose Peppermint 6 for its speed and limited demand on hardware resources. As expected, it runs at speed on this '07 Dell.


I gave myself a good amount of time to explore the desktop before jumping into the daily drill on the Terminal. One of the big pluses for Linux these days is the number (and quality) of open source applications. The days of 'Windows first' are long gone. Most innovations are first released on Linux and then adapted (often as dumbed-down versions) to Mac or Windows. And the best part is that the open source community has made it extremely simple to manage applications.

For example, this software manager on Peppermint lets us install (or remove) applications in addition to managing regular updates. In essence, if I used Linux just for work or fun, it's as friendly as Mac or Windows. My goal, though, is to understand the innards of this amazing system, mostly driven from the command line and lightweight utilities.

For a total newbie, Linux has something called a 'Terminal Emulator' that runs a text-based interface, the 'Shell'. It's something like the command prompt in the Windows world but a lot more powerful. We can get to the Terminal through the menu or the keyboard shortcut <Ctrl> <Alt> t

Okay. The setup is complete and it's time to start the daily drill.

July 1, 2016-tar

I had to start somewhere. And there are a ton of lessons on the Internet. I googled and opened the first link that came my way: 50 most frequently used Linux commands. I started on the first command, 'tar'. After putting in about 30 minutes of iterative reading and experimenting in the Terminal, I think I have a good grasp.

This command deals with archives. Tar stands for 'tape archive'. For example, if we have three files sitting in different folders of a file system, we can combine them all into one archive and send the archive to someone in one shot. This is how most source code used to be shipped in the Linux world. They call them tarballs :-)

As I understand it, an archive's primary objective is to organize files in a structure, though there is added value in compressing the archive for easy distribution. We can use different programs for compression; gzip is one option.
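
As a quick sketch of today's lesson, here is roughly how I exercised tar in the Terminal (the file and folder names are made-up examples):

```shell
# Create a few sample files in different folders
mkdir -p project/docs project/src
echo "readme text" > project/docs/readme.txt
echo "main code" > project/src/main.c

# Bundle everything into one gzip-compressed archive (a "tarball")
tar -czf project.tar.gz project

# List the archive's contents without extracting anything
tar -tzf project.tar.gz

# Extract the tarball somewhere else
mkdir -p unpacked
tar -xzf project.tar.gz -C unpacked
```

The flags read almost like a sentence: c (create) or x (extract), z (run it through gzip), f (the archive file name), t (list the table of contents).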

July 2, 2016-cat

It seems I jumped into the more useful commands yesterday without realizing that 'more useful' commands are also more complex :-) And I didn't cover the basic commands at all. Course correction is needed right off the bat :-)

I know some basic stuff such as change directory (cd), list directory contents (ls) etc. from my school days, but that's very limited in terms of working efficiently in the Terminal. So I decided to step back and first complete the basic stuff. That led me to a Google search for Basic Linux Commands. And here we are.

The first command I took up (cat) was not at all simple, but then I just wanted to follow the list on the link above. As I delved deeper, this turned out to be a wonderful command. If only I had been exposed to 'cat' earlier, writing could have been a lot easier (and more fun).

The name is a misnomer as it does a lot more than mere concatenation. We can actually create files, display them and of course concatenate them. By the way, it's all text files here. I tried opening a Word document using cat; it generated lots of garbage and the CPU went to 100% for a long time :-(

Another realization: it's almost impossible to learn one command at a time. We always get exposed to additional things. Which is a good thing, but it sucks in more time. For example, today, in addition to cat, I learned the concept of redirecting output using >, and scrolling one page at a time, when the output is longer than the terminal window, by piping to 'less'.

Most of today's text was written using the cat command directly in the terminal. It's so amazing to be able to capture endless text without having to use an editor or a messy Word document. Goes without saying, cat has no editing features. I am sure I will find better and easier tools as I go along. Linfo has a great page on cat (as for most other commands).
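
A rough sketch of the cat tricks from today (file names are made up; at the prompt you would type the text yourself and finish with Ctrl-D, but here echo stands in so the example runs on its own):

```shell
# Create two small text files
echo "first line" > notes1.txt
echo "second line" > notes2.txt

# Display a file on the screen
cat notes1.txt

# Concatenate the two files into a third using output redirection (>)
cat notes1.txt notes2.txt > combined.txt

# Page through long output one screen at a time by piping to less
cat combined.txt | less
```

At the prompt, `cat > draft.txt` (then typing, then Ctrl-D) is the trick I used to capture today's text without an editor.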

July 03, 2016- Basics

Now strictly basic stuff first. 'pwd' tells the present working directory. 'ls' is a way to display the contents of the current directory. 'ls -al' shows all the files including hidden ones. Typically hidden files start with a period '.' Changing directories with 'cd' is probably the most common terminal command. 'cd ..' always takes us one step up to the parent directory. 'cd /home' takes us to the home directory where all users branch out into their respective directories. 'cd ~' jumps to the current user's directory (one level below /home).
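
The whole basics drill, in one runnable sequence:

```shell
pwd        # print the present working directory
ls         # list the contents of the current directory
ls -al     # long listing, including hidden "dot" files
cd /home   # go to /home, where the user directories branch out
cd ~       # jump to the current user's own directory
cd ..      # one step up, to the parent directory
```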

One cool thing I learned today is that we can start pretty much any application from the terminal. For example, simply typing 'gedit' starts a new editor window. Typing 'firefox' opens a new browser window. Pressing <Ctrl> c closes these programs.

Typing the name of an app in the Terminal opens the app. Press Ctrl c to close the app.

July 04, 2016- Terminal.

Happy Independence Day

I was wondering why I am getting so interested in the Terminal. Is there an underlying trend? I think yes. The command line is getting popular again. I think it started with Twitter, where # (hash) and $ (cash) tags brought a certain degree of geekdom to a social platform. The Google search bar can do lots of basic things such as math operations or unit conversions in addition to its powerful search. The trend continued with Slack, which reincarnated lots of commands and concepts from IRC. And finally Facebook Messenger and Apple iMessage are opening up to bots and apps to add another layer of sophistication. Does it mean the graphical interface is going away? No, I don't think so, but as computers become omnipresent, users are looking for an easier way to interact with machines. These could be AI/voice interfaces or VR environments. Many users may want to go back to terminals for their speed, maturity and distraction-free environment.

Let's get back to today's command. I explored 'mv' today. It's a very simple command to move files (and directories). Tilde (~) and .. work with 'mv' just as they did with 'cd'. 'mv' can also be used to rename files: moving a file onto a new name is renaming. In fact, there is no dedicated command to rename files.
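
A small sketch of today's exploration (the file names are made up for illustration):

```shell
# Set up a throwaway file and folder to play with
touch report.txt
mkdir -p archive

# "Rename": there is no separate rename command, mv does it
mv report.txt report-2016.txt

# Move the file into a directory (trailing slash makes the intent clear)
mv report-2016.txt archive/

# Tilde and .. work just as they did with cd
mv archive/report-2016.txt ..
```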

July 5, 2016- Basics

Worked on multiple things today. Started the day with the simple commands 'cp' and 'rm'. Basically, copy and remove files (and directories). A word of caution on 'rm': if we remove a file using 'rm', it's gone forever. It doesn't go and sit in 'Trash'. There are some ways to recover files, but they work only in special circumstances and are too complicated at this stage.
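
The morning's drill, roughly (made-up names again):

```shell
# Something to copy
echo "important" > data.txt
mkdir -p archive

cp data.txt backup.txt        # copy a file
cp -r archive archive-copy    # -r copies a whole directory

rm backup.txt                 # gone for good: no Trash can here
rm -r archive-copy            # -r removes a directory and its contents
```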

Then I moved on to exploring the email clients that work within the terminal. A bit of exploration landed me on Mutt. Started with the tutorial by Andrew: http://www.andrews-corner.org/mutt.html

During the process of installing/configuring Mutt, I figured out that 'fetchmail' and an editor (preferably vi) are prerequisites. Installed both and followed the complete tutorial, but there are still some issues with 'procmail'. Will work on them tomorrow.

Later in the night I explored 'vi'. Read comparisons of various other editors. Writing this on gedit. Also installed 'emacs'. Sent a note to Andrew seeking help on Mutt. Tomorrow I will try to close out the email work and focus on editors, unless something more exciting catches my attention.

July 06, 2016-mail and browsing

Andrew responded to my query on Mutt. It was a mistake on my part; I had not installed 'procmail'. As soon as I did the installation, mail started downloading. It took me a while to understand the 'vi' interface to reply to the test email. The reply, however, didn't go through. Sent another note to Andrew. He very kindly responded again to my newbie questions. This time I was missing another program, 'msmtp' :-( Everything worked great once I installed msmtp, but there is still a lot to explore in the Mutt interface.

The great thing about Linux is the community and the way people respond to one-on-one help requests. I didn't know Andrew at all. I just read his tutorial and sent him a note when I couldn't get through. He responded the same day in language full of support and friendliness.

Encouraged by my success with Mutt, I thought of installing a text/terminal-based browser: 'elinks'. Starting Chromium or Firefox is not only a hassle, these graphical browsers have too many distractions. I learned that 'emacs' has a built-in browser, but I am keeping 'emacs' for tomorrow. For today let's just play with elinks.

Installation was a breeze. It starts up in no time from the Terminal itself. The best reading experience was Wikipedia, where I ended up reading the whole history of the GNU project, Linux and many more things. Reading through Google News or Finance may take some more time. Twitter was bad. Amazon is worthless without product pictures. I tried to google elinks-friendly websites, but to my surprise no one has cataloged or ranked websites based on their text-browser compatibility. Maybe this will be my next project. The menu-driven elinks is fun.

July 7 , 2016-emacs

The day was spent exploring 'emacs'. It's a powerful and fun way to write. A very intuitive way to use the keyboard to meet pretty much all the word-processing needs I could imagine. I barely scratched the surface with the first tutorial on 'emacs', and as a practice session, I am writing today's journal in 'emacs'.

Today being the 7th day from the start of this journey, I think I should summarize the week's progress and publish this post.

So here is what I have accomplished in the first week of my Linux journey:

  • Ran through basic commands such as pwd, ls, cd, cat, tar, less, more, mv and rm
  • Installed ‘Mutt’ and tested terminal based email
  • Installed and explored ‘vi’ editor
  • Installed and explored ‘elinks’ text based browser
  • Installed and explored ‘emacs’ editor.


Started the week learning IRC. Given that IRC is tightly knit into the open source community, I think it's important to explore this messaging system.

Here is the link I used to guide me through the process. Tim Watson has broken it down to the very basics for a new user. I went ahead and registered my nickname and a new channel #smbstack on the Foonetic network. You are welcome to drop by and say hi, old-school style :-)

By the way, for those on the go, there is an iPhone app for IRC as well. I found it very functional.

The best parallel to IRC in today's world is Slack. No wonder it's catching on fast in the enterprise world. Slack runs on a similar concept of channels, and a ton of IRC commands have found their way into Slack chat. For example, /join channelname works in both.

Despite being around since 1988 (before the web), and despite numerous free messaging apps with user-friendly interfaces, it's surprising that IRC has survived. In fact one of the leading networks, Freenode, is growing in usage. See a 2012 article on Pingdom.

July 12, 2016- Command line History

Explored command-line history today. One reason for the Terminal's popularity is that it lets me search all the previous commands I typed. A simple press of <Ctrl R> opens something called 'reverse-i-search'. Now any string I type shows me the last command I used containing that string. Of course, I can print the complete history of commands on the terminal with '$ history | more'. A quick print of my command history showed that I have run 507 commands so far. Mostly cd or ls :-)

Of course I can scroll up or down one command at a time using up or down arrow.

We can also instantly re-run, say, the 10th command back in history using '!-10'. A complete treatment of command history is available at this wonderful post.
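
The day's findings, gathered in one place (the interactive tricks only make sense typed at a live prompt, so they are shown as comments):

```shell
history             # print the full numbered command history
history | more      # page through it one screen at a time
history | grep tar  # find every past command mentioning "tar"

# Interactive tricks, typed at the prompt rather than in a script:
#   Ctrl-R          reverse-i-search: type any string, get the last match
#   !-10            re-run the command 10 entries back
#   Up/Down arrows  scroll through history one command at a time
```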

The benefits of command-line history are many:

  • Makes it easy to re-run long, complex commands from past days' or weeks' work.
  • We can pull a command from history and then modify it to match our current focus. In fact, once I have invested some time in the Terminal, I might never type a full complex command again.
  • The system also records the timestamps of previous commands. That's very useful for audit and accountability.
  • This is an inherent advantage over the graphical interface, as there is no way to search my mouse movements. I can undo some actions, sure, but that's very limited functionality compared to command-line history in the Terminal.
  • We are searching history across multiple applications and system operations. In essence, the whole operating system turns into one big application.

July 13, 2016- Finding my way to the USB drive through the Terminal.

Peppermint 7 was recently launched and I am still running version 6 on my Dell laptop, the one I am using for this post and for exploring Linux. I ran through a quick comparison and I think it's time to upgrade to 7. I have a big desktop running Windows 10 that I rarely use, as most of my work is now centered around this Peppermint machine. Maybe I can get rid of Windows for good!

I downloaded Peppermint 7 via torrent. Peppermint comes with 'Transmission', a native torrent app. The 1.2 GB image didn't take much time. I downloaded 'Unetbootin' to write the image to a USB stick.

Before I burned the image, I thought of formatting the USB, and why not use the command line for it? That's how I got into exploring the commands around volumes, formatting features and file-system management.

In the Peppermint 6 desktop (lxde), the file manager 'Nemo' manages the files in a Windows-like graphical user interface. This means that the moment we insert the USB stick, 'Nemo' shows it in the file system, and we can click on it to see the contents, or right-click on it for utilities such as format or properties. In the older days this was not the case. Each media device was mounted at a specific file-system location (directory). This meant we could mount a USB device in, say, the documents folder or any other working directory, and the device became seamless with the file system. I think to make it easy for 'mouse and click' people (like me), nowadays the devices (USB sticks) are automatically mounted in the '/media' folder. So if I want to browse the contents of the USB stick, I just 'cd /media' and I should be able to 'ls' the contents. No need to 'mount' the pen drive. However, the 'mount' command is very useful as it lets us see all the system devices and their respective locations.

The first line of the Terminal screenshot above shows that '/dev/sda1' is mounted at '/' and is of type ext4. This is the main hard drive. (ext4 is a file system like FAT32 or NTFS; I will explore file systems more later.) The second-to-last line in the terminal window shows that sdb1 is mounted on '/media'. This is the USB device (also shown in the Nemo window on the right). It got mounted in /media automatically. So now we know how to find USBs through 'cd' commands.
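
For the record, the exploration looked roughly like this (/media is where my Peppermint 6 auto-mounts sticks; other distributions may use a path like /media/<username> instead):

```shell
mount               # list every mounted device and where it lives
mount | grep media  # narrow the list down to removable media, if any
cd /media           # walk into the auto-mount area...
ls                  # ...and the USB stick's contents should be here
```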

July 14, 2016- Finding the space on USB and other drives

Continued my exploration of the USB media. Now that we know how to find our way to the USB, the next question is: what is the simplest way to check the space available on storage devices? The command 'df' is very handy. Rather than opening the Nemo window and looking at the properties of the USB (by right-clicking), it's a lot easier (and faster) to type just 'df' at the Terminal. Here is a comparison screenshot.

The 'df' output is a bit cryptic to read the first time, but that's a small learning investment for the speed and agility of the rest of life :-)
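
A couple of variations that make the output friendlier (the /media path is just an example mount point):

```shell
df           # free space on every mounted file system, in 1K blocks
df -h        # the same, but "human readable": sizes shown as M and G
df -h /      # limit the report to one mount point, e.g. the root drive
```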

July 16, 2016- Installing an Ubuntu Desktop.

My plan to build my desktop with Peppermint 7 took a hit because I couldn't boot my desktop from the USB stick. I spent hours researching the BIOS of my Asus motherboard. Even sent them a tweet, but nobody responded :-(

Luckily my old Dell laptop has a CD writer. I guess it was last used five years back. Luckily I had rewritable CDs somewhere; it was only a matter of checking with my wife. She is a savior when it comes to finding forgotten things :-). As I was getting ready to burn the CD, the next problem showed up. The max I could write was 700 MB, whereas the Peppermint 7 image runs over a gigabyte. So, change of plan: I googled for the smallest Linux distro available to burn on CD. There are many. The quest finally took me to the minimal CD installation.

The minimal CD is just a 48 MB image that boots the PC and then uses the Internet connection to download the rest of the image. It was a no-brainer to go for the minimal CD, as I didn't want to download another big file.

Burning a CD after a long time turned out to be a fun project. On Peppermint I found 'Brasero', the best-reviewed media-burning software. Installation took hardly any time and I had the minimal CD ready in 15 minutes. I did think of exploring command-line options for burning the CD, but I had already spent too much time on this mini project.

I stuck the CD in my desktop, took a last good look at Windows 10 :-) and rebooted the machine. Installation started smoothly; it picked up my home network and managed to run with very little manual input. I could have gone for a dual-boot option, but I decided to wipe the system clean. If I need Windows later, I would rather install it on a virtual machine on Ubuntu. Overall installation took around two hours, and here I have a 500 GB desktop with 4 GB RAM running the Ubuntu desktop.

July 18,2016- Unzipping the files and handling spaces in file names.

Despite all the increase in network bandwidth and storage capacity of modern computers, we still send and receive zip archives through email. Which is not a bad thing, as a zip can carry the whole directory structure. Zip archives can be very easily unzipped from the terminal. The command is simply 'unzip <filename>'. We can also provide a destination. Of course, this can be easily done in the graphical file manager as well, but again, the terminal has a plus on speed, and if we live in the Terminal then starting another interface is taxing.

Many times files have names with spaces, e.g. 'Dashboard Screenshots.zip'. These can be handled by quoting the name, to let the terminal know that it's just one file. This applies across all the commands, at least the ones I have explored to date, such as 'cd', 'mv', 'cp', and 'unzip'. So if we want to unzip the archive named above, the command goes as below.

In the screenshot above, I first tried to unzip the archive without quotes. The Terminal thought these were three different files. I then filled the space with a % sign, but that is more of an Internet thing (URLs replace spaces with % codes); it didn't work in the terminal. Then I quoted the file name and it worked. Since I didn't specify a destination directory, it automatically created a directory in the current location and extracted the files there. Even the new folder had a space in its name. I used 'cd' with quotes to navigate to the new folder and listed the files there.

The reason bash doesn't ignore spaces is that it tries to find some other option or argument after a space. Another way to have bash treat the space as part of the name is to insert a backslash '\' immediately before the space. For example, I could have used $ unzip Dashboard\ Screenshots.zip. This looks a bit odd at first, but the backslash is simply a mechanism ('escaping') to tell bash that the space belongs to the file name.
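
Both forms side by side (the file here is an empty stand-in created just for illustration; the real command would be unzip on an actual archive):

```shell
# A name with a space in it, for illustration
touch "Dashboard Screenshots.zip"

# Without quotes, bash would see two separate arguments.
# These two forms both pass the name as one argument:
ls "Dashboard Screenshots.zip"    # quote the whole name
ls Dashboard\ Screenshots.zip     # or escape the space with a backslash

# The same applies to unzip (assuming a real archive):
# unzip "Dashboard Screenshots.zip"
```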

By the way, I am using the Ubuntu desktop today for my work and for this post. Not missing Windows 10 at all. It is amazingly fast, though to be fair, the hardware specs on this desktop are better than those of my old Dell. The desktop interface on Ubuntu (Unity), however, is very different from what I used in Peppermint 6. It looks a lot like a Mac, whereas Peppermint 6 (lxde) was more of a Windows XP feel. There is a bit of learning and getting used to. I think I need to invest a day researching the different desktop interfaces offered by Linux distributions. Maybe tomorrow…

July 19,2016- Desktops (DEs)

Saying that it's mind-boggling is an understatement. Linux has close to three hundred different distributions and at least ten major desktop environments. So much so that there is an online site, Distrowatch.com, for news and for tracking these distributions. If we mix and match each distro with compatible desktops, we are looking at thousands of possible combinations. And new distros are being added almost on a daily basis. Nowadays, the trend is to target a very specific user base for a distro. For example, 'BlackArch' targets testing and security research, with over 1500 relevant tools built in. There is a project porting Android to x86 platforms, and it's working well.

A good look at leading desktop environments is here on Linux.com

So far, I have explored only two desktops.

I used 'lxde' as it is the default DE with Peppermint 6. 'lxde' is a lightweight desktop. It lends speed and suits older hardware, the big focus for Peppermint OS. For me it was a good starting point, as my focus was to learn the command line rather than marvel at desktop features. I found the interface the least intrusive but very effective. It helped me focus on the command line while ensuring ready assistance whenever I felt the Terminal was overwhelming for my chores.

With the installation of Ubuntu 16.04 LTS Xenial Xerus (yes, that's the name), I experienced Ubuntu's own shell, 'Unity' (built on top of GNOME). It's an extensive desktop emulating lots of Mac-ish features. The launcher acts much like the Mac Dock. The windows are controlled from the top left, and the file menu is always hinged to the top of the desktop. They have something called 'Dash' to search apps (and more), similar to 'Spotlight' on the Mac. Macworld published a very good article comparing Unity with El Capitan earlier this year.

There are a few ways where Ubuntu is ahead of Apple. We particularly like the keyboard-based menu system. In any app you can press Alt and type in the name of the menu option you want, then pick it from a list. It’s faster than switching to the mouse or track-pad.

Generally speaking, applications are intimately blended with their respective desktops. Each desktop has its own preferred core applications. In a sense, the desktop now acts as the platform rather than the Linux kernel. So if we are coming to Linux for a set of applications (which is not the normal case), we need to choose the right desktop. This means looking at which file manager is integrated into the desktop, which editor, which mail client, which system monitor. Changing default applications is easy but an added task. Swapping out the whole desktop for something else is technically possible but not advisable. This is another reason the Terminal is so useful: except for a few minor differences, Terminal commands work the same across all the distributions.

July 22, 2016- Do we really need Desktops?

While desktops are great, the question is: at what cost? I checked the computing resources Unity (on Ubuntu 16.04) consumes without any other application running. It's almost one GB of RAM. For a 32-bit computer where the maximum RAM is 4 GB, that's almost 25% of the most precious resource. The screenshot below shows the steady-state memory consumption of the Unity interface with only the System Monitor running.

The question is: what value is this interface adding? Yes, there is a nice dock, dash, file manager, software manager and widgets like time, network etc., but is it worth investing 25% of RAM? On the other hand, lxde (on Ubuntu 14.04, aka Peppermint 6) consumes only 150 MB standalone, but that's still a significant amount.

If all these features are easily available in the terminal, is there a way to bypass the interface altogether, and still be able to use graphical applications such as a browser when needed? One argument is: why worry too much about resources when they are so cheap anyway? We can always upgrade to a faster chip and more memory, but to me that sounds like strengthening my calves because I need to carry my overweight body. Why not cut the weight down and strengthen my muscles at the same time?

A little research revealed that Linux allows us to switch to a complete text mode by pressing Ctrl Alt F1 (and switch back to graphical mode with Ctrl Alt F7). A quote from a Stack Exchange forum:

Linux has by default 6 text terminals and 1 graphical terminal. You can switch between these terminals by pressing Ctrl + Alt + Fn. Replace n with 1–7. F7 would take you to graphical mode only if it booted into run level 5 or you have started X using startx command; otherwise, it will just show a blank screen on F7.

I tried Ctrl Alt F1, hoping for a Terminal window similar to the emulator, but it turned out a lot less functional. And I also understood why the one in the graphical interface is called a Terminal Emulator: it is emulating the console. The problem with pure text mode is that it doesn't allow any desktop-based applications. For example, I couldn't start Firefox or even gedit; elinks and vi worked fine. It looks like there are applications that were written for the console versus newer applications that are part of the desktop experience. I will continue my search for more applications that can be used without the graphical interface.

July 26, 2016- Ubuntu Mate

On my quest to find a fully functional user interface for a newbie (like me) with optimum consumption of system resources, I installed Ubuntu Mate. The good thing about the minimal CD image is that it allows many Ubuntu variants such as Lubuntu, Kubuntu, Ubuntu Server and Ubuntu Mate (to name a few). Based on multiple reviews, I chose Ubuntu Mate as it's modern, customizable and full-featured, and promises much less load on the system (vs. Unity). And that proved quite right, as in the graphic below.

Mate is a pleasant experience, to say the least. People compare the interface with Windows 7, but I found it a lot more than that. First, the learning curve for an average Windows user is negligible. I went for a minimal installation of Mate, which doesn't come with any pre-installed software. I installed the Chromium browser because I need the Ebsta plug-in for Salesforce, which is only available on Chrome (or Chromium).

One key missing piece was the Ubuntu Software Center. Mate has its own implementation of a software center called 'Ubuntu Mate Welcome'. Installation is the simple Terminal command below.

$ sudo apt-get install ubuntu-mate-welcome software-center

I found 'Mate Welcome' a much cleaner, more minimalistic interface than the Ubuntu Software Center. Both these graphical interfaces are great places for new entrants to explore software on Linux.

For now, exploring the beauty of Mate :-)

July 26,2016- Installing software from command line- slick and fast

It's been a while since I explored a new command. During the course of exploring desktops and installing software, I did pick up a few new things such as 'sudo' and 'apt-get'.

'sudo' stands for 'superuser do'. Basically, it's a way to authenticate the user with a password. Whenever we install Ubuntu, it always creates a user for us. Root access is never given to a new user, to avoid accidental mess-ups. However, certain tasks such as installing new software can be performed only if we provide our password. 'sudo' is a way to tell the system that the installation is authorized by me. In a nutshell, my user has more power if I provide my password. So no one else can install software (or do other admin tasks) on my machine unless they have my password. The concept is very similar to the Mac. In other distributions, we can switch to root with the command 'su'.

APT stands for the Advanced Package Tool (Debian's, and hence Ubuntu's). With 'apt-get' we can download software packages, and 'apt-cache' lets us search the repository. 'apt-get install' needs root permissions, so we need to prefix it with sudo.
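
The typical rhythm looks like this (assuming a Debian/Ubuntu system; 'emacs' here is just an example package name):

```shell
# Search the package repository for anything mentioning "emacs"
apt-cache search emacs

# Install a package; apt-get needs root, so we prefix it with sudo
sudo apt-get install emacs

# And the reverse, to remove it again
sudo apt-get remove emacs
```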

July 29 ,2016 — Geary

Looks like I have found my first killer graphical app for Linux. And yes, it's a good old email client with a new paradigm. I have now been using Geary for the last four days. I started with one of my less-used accounts on my personal Google Apps domain (smbstack.com), and as I figured out the navigation and overall stability of the application, I decided to go all in with my main Gmail account. Goes without saying, it has two-step authentication, holds more than 100K messages and is active all the time. I had to create an application-specific password, which is easy. Thanks, Google.

Why I love Geary, and what the new paradigm of email is.

Here and Now
There used to be a time when the email client on the computer was the hold-all place for email, and web access was a 'once in a while' thing. The reasons: poor bandwidth and meager online storage. Things have changed with ubiquitous availability of the Internet and practically unlimited storage from most service providers, starting with Gmail. This means online is now the main storage (and research) space, and the client should pivot on the recent. Most mobile email clients understood this shift, but the heavyweight desktop apps (Outlook, Thunderbird) are really designed for big archives of email. Geary can handle lots and lots of email, but the beauty of this client is that it is lightweight, designed to focus on the 'here and now'.

Email client is NOT dead
With easy online access, many designers questioned the very purpose of a native email client, that too on a wired desktop! I can probably do a lot more with Gmail in the browser, but it somehow doesn't feel real. Remember the mistake Blackberry made in shipping their first tablet without an email client. It just didn't work on mobile devices, and desktops are no different. The speed, the interface and the control of a native application far outweigh the browser interface, and of course browsers consume a lot more resources than a thin email client. And the fact is, all of us have multiple accounts: at least one from work and one personal. An Apple, Google, Microsoft or ISP ID is always forced on us with a respective email account :-) And if you are like me, managing multiple domains, there is every chance you have at least ten different email accounts. Accessing them daily in separate browser windows is a pain. Geary (like any other client) gets all our email in one place and does it in a frictionless manner.

One good lesson to learn from Apple is that 'minimum design' evokes creativity. It helps improve our productivity and creates a natural pull back to the application. Geary does that. For example, signatures are part of the 'account settings'. The down arrow takes me to the next message. Shortcuts are meaningfully abundant but not a mental math challenge. Conversations are the default view and very intuitively shrink to get me to the latest message in a thread. Labels are built into the core design. And many more.

The unfortunate part is that the Yorba Foundation, the people behind this beautiful open source project, closed down because of lack of funding. But as with most open source software, there is always someone to catch the baton and carry on.

There is a bug in Geary that makes typing lag when composing a new message. A workaround is to disable the iBus daemon.

July 30, 2016- Shells

Just so that I am clear on definitions: a terminal is a window that is a text interface to a computer. Inside the terminal runs a command interpreter called a ‘shell’, which reads our commands and executes them. Ubuntu uses ‘bash’ as its default shell, though we can install different shells such as ‘ksh’ or ‘tcsh’. (The terminal window itself is rendered by a separate terminal program, not by the shell.) A terminal without any graphical desktop interface is called a ‘console’. Even if we have installed a desktop (e.g. Unity or Mate), we can open Linux in console mode with a simple ‘Ctrl Alt F1’. In fact we can open up to six console sessions, going up to ‘Ctrl Alt F6’. To get back to the graphical interface we do a ‘Ctrl Alt F7’.

In the desktop graphical interface (Ctrl Alt F7), if I want to run a terminal-like interface, I can do that with ‘Ctrl Alt T’. It’s called a terminal emulator for the simple reason that it’s emulating the terminal within the graphical interface. A newbie like me will always use a terminal emulator because it’s easy and a lot more friendly than the raw console.

Within a terminal (or terminal emulator) we are running a shell (e.g. bash). Bash has three major modes. Login mode is what we get when we sign in with our user name and password. Interactive mode lets us key in commands manually using the keyboard, or simply cut and paste. Non-interactive mode lets us run shell scripts (a text file full of sequential commands).

The important thing is that shells can be nested, meaning we can start bash within a bash session. The new session can be used to run, say, a script, and then exit with ‘Ctrl D’. Many applications start their own shell and exit when we close the application. We can also nest different shells; for example, we can start ‘ksh’ within ‘bash’ for a ‘ksh’-specific script. Goes without saying that ‘ksh’ should already be installed.
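One easy way to see the nesting is the SHLVL variable, which bash bumps by one for every nested shell. A minimal sketch (the version string will differ from system to system):

```shell
echo "$BASH_VERSION"                          # version of the bash we are in
echo "outer shell, level $SHLVL"              # SHLVL counts shell nesting depth
bash -c 'echo "inner shell, level $SHLVL"'    # a nested bash reports one more
# in an interactive nested shell, Ctrl D (or 'exit') drops back to the outer one
```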

In the picture above, I found the version of the current bash shell installed on my system: bash version 4.3.46. Then I started a bash shell within the already running bash. No error, though I didn’t have anything to do there, so I exited. Then I tried to start ‘ksh’, but it looks like it’s not installed. I will install it some other time and check it out.

Aug 01, 2016- Review of the first month’s progress.

It’s been a month since I started the GNU Linux journey. The idea was to DIY Linux with the help of on-line resources, without following any set curriculum or single source of organized learning. Google, of course, is my primary friend and guide :-)

It’s time now to review all that I could accomplish in one month. To put it in perspective, I will categorize the work into five broad buckets.

- General Concepts
- Installations and Desktop Environments
- Desktop Applications
- Shell Commands
- Shell Applications

I started with the most basic of concepts as to what GNU Linux means when we see it from a desktop OS perspective. The big pillars of an OS such as boot, kernel, terminal, shell, desktop. Understood the concept of the open source movement and what it means to be free software under the GPL. The unique value and culture nourished by the open source movement. Various distributions and their versioning. Helpful resources, forums and podcasts. I didn’t document everything that I learned progressively in this category, because lots of it comes by association and it’s an ongoing process.

I installed Peppermint OS with LXDE, Ubuntu with Unity, and Ubuntu Mate. Did some research on Elementary OS as well, and probably that will be my next installation if and when Loki is available as a stable 32-bit build. As of now, I have Mate running on my desktop and Peppermint on my laptop.

I explored a number of applications in the three desktop environments that I have used thus far. Most impressed by Geary and Shotwell, both from Yorba Foundation. Closely followed by the eBook reader and converter Calibre. Thunderbird is a great communication platform, but it reminds me of Outlook. I liked Transmission as a simple torrent client. Corebird is a cool Twitter client, but it was choking on Unity; I will give it a spin on Mate. Gnome-screenshot is awesome. I installed LibreOffice on all the desktops, but all my work goes into Google Docs; I rarely download or edit a file. Both Firefox and Chromium are amazing. Chromium consumes more resources, but it has plugins that help my work; Firefox is a breeze for everything. Gnote is a great tool for notes: it creates hyper-links (wiki style) and maintains them automatically. I started exploring GnuCash but left it in the middle. This is one tool I want to explore as a standalone business application. Dash is great on Unity. Synapse didn’t work the first time on Mate; need to spend more time on it.

The file managers Nemo on LXDE, Caja on Mate and Nautilus on Unity are all very good. I didn’t notice much difference, probably because I don’t have too many files. The software manager on LXDE is fast and functional; Ubuntu Software Center is trying to do a lot, like the Mac App Store, but it will probably need a few more iterations. The Software Boutique on Mate is visually very appealing, but it doesn’t have all the titles. Goes without saying that the command line is the easiest way to install software.

Telegram, the new messaging application (like WhatsApp), has a nice native application for Linux. I just tested it, though it’s not much use as most of my connections are on WhatsApp. While on messaging, I spent a lot of time understanding IRC. XChat, the default IRC client on LXDE, is very fast with minimal resource use. Chatzilla in Thunderbird, as well as Hexchat on Mate, are equally good. I also looked at the web interface to IRC for the Mate community. Need to explore more on the web interface for my own IRC channel :-)

I have run through most of the basic shell commands: change directory (cd), create directory (mkdir), list files and directories (ls), copy files (cp), move files (mv), delete files (rm), manage archives (tar and zip), manage text (cat), volume management (mount and df), root authentication (sudo), software installation (apt-get and apt-cache), command line history (history | more). I still need to work through the dozens of options that come with each of these utilities, but all that can be done with the man pages.
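A quick drill tying most of those commands together. The file and directory names are made up for the exercise, so run it somewhere scratch:

```shell
mkdir -p scratch && cd scratch                 # create a directory and enter it
echo "hello linux" > note.txt                  # create a small file
cp note.txt copy.txt                           # copy it
mv copy.txt renamed.txt                        # rename (move) the copy
ls -l                                          # list what we have so far
tar -czf backup.tar.gz note.txt renamed.txt    # archive both files
cat note.txt                                   # print the contents
rm renamed.txt                                 # delete the renamed copy
cd .. && rm -r scratch                         # clean up (careful with rm -r!)
```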

I have explored two shell applications so far: Mutt for email and eLinks for text-based browsing. Both are solid applications. Over the next few weeks, I am planning to explore more shell applications, to the extent that I could run my life from the console (or terminal emulator). Some of the key applications I am looking for: a news-feed reader, stock quotes, weather, multiple email accounts within Mutt, messaging tools (IRC, WhatsApp, FB Messenger, Google Hangouts), a podcast player, a media player.

Aug 02, 2016- Twitter in Rainbowstream

I haven’t yet reached a stage where I could give up the desktop for my work. The only way to spend more time in the terminal is to make sure that I gradually move most of my personal stuff onto the emulator. So the question I asked myself this morning was: what is the next application I need beyond email and browser? Turns out Twitter is my addiction, and unfortunately Twitter sucks in the text-based eLinks. It actually does on Firefox too :-( when compared to dedicated clients on the desktop and the official app on iOS. Frankly, I was not very hopeful of finding a terminal client for Twitter, for they have systematically killed even the third party desktop clients (sad). Still I googled, and very pleasantly landed on Rainbow Stream.

The shiny website gave me hope and I went on to install the package. But wait, I first had to install Python and pip. A few commands, very well documented here. Twitter authentication is very well designed, much better than in most desktop clients. It all went smoothly and the Twitter flood gates opened onto my terminal.

The first and last thing we need to know with this app is how to press ‘p’ and how to press ‘h’. The first pauses the stream, much needed for those who follow a thousand-plus active accounts, or else tweets would zoom past the screen. The second is where all the help is available. Believe me, the help here is a lot better than man pages. It greets you with:

Hi boss! I am ready to serve you now :-)

It covers every feature of Twitter, with the good exception of ‘moments’ :-). Trends, lists, direct messages, profile views, followers and friends, and of course the easiest possible way to tweet and re-tweet. I have tried Twitter on pretty much all the browsers, desktop and smart phone clients; this is by far the fastest and best Twitter experience to date.

So now I am good with email (Mutt), browser (eLinks) and Twitter (Rainbow Stream). Next up: news feeds, stocks and weather… tomorrow.

Aug 03, 2016- News-feed in Newsbeuter

Despite my love for Twitter, I must admit I still go to the CNN and BBC homepages at least once a day (old habits). That said, the fact is both websites are inefficient when it comes to delivering content quickly. And their endless cajoling to make you a subscriber is annoying, to put it mildly. Hence the need for a feed reader. I was a big fan of Google Reader, and since Google pulled the plug I have tried subscribing to feeds in email clients; frankly, nothing worked for me. Lately Feedly has been a great source, and I wished for something similar in text-only mode. So the quest for a fast text-based newsreader led me to Newsbeuter.

I read the reviews and the claims that it’s the Mutt of feed readers, but didn’t really believe them till I started using it. Installation was a breeze. The only problem was building the ‘urls’ file hidden somewhere in the file system. It took me a moment to realize that instead of editing an existing file, I was supposed to create a totally new one and add the URLs. Once done, the feed worked like magic. Fast and distraction free. In addition to CNN and BBC, I also added a feed on ‘twtr’ from Nasdaq.

The shortcuts are similar to Feedly: ‘j’ for the next and ‘k’ for the previous article (or I should say Feedly has those similar to Newsbeuter, since the latter has existed since 2007). Newsbeuter is lightning fast compared to Feedly, for obvious reasons, and probably uses a tenth of the resources.

Aug 04, 2016- Stock Quotes

David Walsh did a nice post on getting quotes with a simple command from Yahoo Finance. I modified the command to add in my portfolio. Using command line history, it’s easy to run this command whenever I want to look at the stocks I am interested in. I think someone has written a shell script to do the same, but I am far away from understanding scripts. For now it works well. It shows me the current price and the price change. That’s all I need to know that everything is fine :-)

The command uses cURL to fetch the data from Yahoo Finance. cURL is a cross-platform open source tool to transfer data. It supports most (if not all) protocols and authentication mechanisms.

Aug 05, 2016- Weather

More from David Walsh’s post. I am turning into a fan of his thoughtful writing. Getting weather information on the command line turned out to be easy and a lot more fun. All I need to say is

$ curl wttr.in

It senses my location (not sure from where) and lets me know if it’s T-shirt or jacket weather.

Adding a /Moon@date shows the Moon in all its glory. It’s going to be a full moon on 19th August.

Thanks and credit to Igor Chubin

Aug 09, 2016- Back to Terminal basics

I am getting very comfortable with the terminal. Here are a few things that I learned along the way that make the terminal a very fun place.

First is using ‘tab’ to complete file names. Files coming from the Windows world can have crazy long names. Typing the full name just to copy, move or remove a file is a pain, and that’s where tab comes in handy. We can type the first few characters of a file name and then press tab to have the terminal automatically complete the name for us. If there are two files with similar starting names, tab will take us to the point where the similarity ends and let us type another character to pick the right file.

Second, math is built right into the shell. We don’t need to fire up a calculator at all. For example, $ echo $((4+9)) displays 13 on the screen. It can handle more complex expressions as well (keep in mind the arithmetic is integer-only). The best resource I found on math operations is at Shell-Tips.
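A few more expansions to play with; note that because the arithmetic is integer-only, division truncates:

```shell
echo $((4+9))         # addition: 13
echo $((6*7))         # multiplication: 42
echo $((7/2))         # 3, not 3.5: integer division truncates
echo $((2**10))       # 1024, bash supports exponentiation too
x=5
echo $((x * x + 1))   # 26; variables drop the $ inside $(( ))
```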

Third is redirecting the output with > or >>. The default output for most commands is the terminal screen (long gone are the days when it used to be paper, for the computers of those days didn’t have monitors :-). For example, if we run the above math operation, echo sends the output to the terminal screen. We can redirect the output somewhere else, say to a file. E.g. $ echo $((4+9)) > file4.txt will overwrite the contents of file4.txt with the output of the math operation, in this case 13. If the file doesn’t exist, it will be created for us. With >> the output gets appended instead. Goes without saying that since the output now goes to the file, nothing is displayed on screen.
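The overwrite/append difference in one short session (file4.txt as in the text):

```shell
echo $((4+9)) > file4.txt         # create or overwrite: file holds "13"
cat file4.txt                     # 13
echo "second line" >> file4.txt   # append: the file now has two lines
wc -l file4.txt                   # 2 file4.txt
echo "fresh" > file4.txt          # > truncates again: back to one line
cat file4.txt                     # fresh
```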

Fourth is piping. The character ‘|’, known as a ‘pipe’, joins two commands by feeding the output of the first into the input of the second. A typical use case is paging through long output, e.g. $ ls -al | less, or collecting the output of such commands into a file for sending or posting to a forum. My earlier attempt, $ echo $((4+209)) >> file4.txt | less file4.txt, does append the sum (213) to file4.txt and display the file, but only because less was given the file name as an argument; the pipe itself carries nothing there, since echo’s output was already redirected to the file.
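If the goal is to both save output to a file and see it on screen, tee is the tool purpose-built for that: it reads from the pipe, writes to a file (-a appends, like >>) and passes everything through to the screen:

```shell
echo $((4+209)) | tee -a file4.txt   # prints 213 AND appends it to file4.txt
ls -al | less                        # the classic pipe: page through long output
ls -al | tee listing.txt             # save a directory listing while viewing it
```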

Aug 15, 2016- Searching from the command line

grep lets us search a number of files for a string. The output is the lines of text where the string appears, with the matched text highlighted in red. The option -i ignores the case of the string; -v inverts the search (shows non-matching lines); -l lists only the names of the files that contain a match.
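A sketch on a throwaway file (the file name and contents are invented for the demo):

```shell
printf 'Alpha\nbeta\nalpha beta\n' > words.txt
grep alpha words.txt      # matching lines only: "alpha beta"
grep -i alpha words.txt   # ignore case: matches "Alpha" too
grep -v alpha words.txt   # invert: lines WITHOUT "alpha"
grep -l alpha *.txt       # just the names of the files that match
```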

apropos lets us search the man pages for a topic. Let’s say we search the man pages for mutt (the email client).

Each line in the output corresponds to one man page whose name or description mentions the keyword (mutt). A very easy way to find help without leaving the terminal.

find allows us to search for files and directories on a given path. It comes with numerous options and filters to narrow the search down. The search is powerful enough to find a needle in a haystack.
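A few find invocations to get going (the paths and patterns here are examples):

```shell
find . -name '*.txt'            # every .txt file below the current directory
find . -type d -name 'doc*'     # only directories whose names start with doc
find ~ -mtime -1                # files in home modified in the last 24 hours
find . -size +10M               # files bigger than 10 MB (GNU find)
```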

Goes without saying that the outputs of grep and apropos can be redirected to a file using > or appended to a file using >>.

Aug 16, 2016- git

I had set up a GitHub account long back using their browser interface. Never used it, for I thought it was for open source enthusiasts! What use could it be for a common man? Couldn’t have been more wrong.

As I started exploring Linux applications, and with a better appreciation of the open source movement, I understood the value it delivers to globally spread out contributors, both for version management and as a central repository of code and documents. I am not new to version management. Having lived most of my life in SharePoint, I know what a pain keeping tabs on all the versions is.

A little more exploration revealed the network effect of GitHub. It is to developers what Facebook is to common people. Bragging rights aside, it really pulls developers and contributors together and helps bring their best out to the community.

Still, it was a tool. Another web tool!

The real value became apparent to me when I started using the command line interface for git. Luckily, I landed on the online manual at git-scm.com, one of the best online books I have ever come across. It looks like a lot, but the first two chapters are good enough to get us started. Following the instructions, I installed and configured git on my system and cloned my first repository. I was also able to complete my first commit. All of it in less than one hour of reading and experimenting.

Aug 17, 2016- git continued

Installing git on Ubuntu is very simple. Like any other package, it’s a sudo apt-get install. Configuration has three steps, to let git know our user name, email address and what editor (nano, emacs or vi) we want to use. Goes without saying that we should have the editor installed up front. I think the git install automatically pulled in emacs for me. Going to stick with emacs, as I want to explore this editor more.

We can check the configuration at any stage by calling $ git config --list. There is help available on all the topics through $ git help <verb>. I tried config, clone and add. Worked very well; probably too much information.
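The three configuration steps, roughly as I ran them; the name and email below are placeholders:

```shell
git config --global user.name  "Your Name"
git config --global user.email "you@example.com"
git config --global core.editor emacs   # or nano, or vi
git config --list                       # verify everything took
git help config                         # detailed help on any verb
```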

My nephew has started building a free eCommerce website for a science apparatus company in India, to help them reach out to school and college labs. I cloned his repository with

$ git clone https://github.com/siddharthjn/nsaw

Cloning a repository creates a directory and initializes git in it. For example, it created a directory nsaw in my Documents (because that’s where I cloned it) and initialized git. It also fetched all the files in master to get me started. It didn’t fetch any of the branches yet.

There are many other ways to get started with git repositories; lots of content is available in the tutorial. I could have put the repository in some other target directory as well by providing a target path in the clone command. I am just keeping it simple.

Adding new files to git for tracking is as simple as switching to the project directory and saying $ git add filename. I created a test file just to run through the full cycle. $ git status shows a very elaborate status of all the files, as to whether they are modified or staged. $ git commit tells git to commit the changes as a new version. We can remove or ignore files from tracking at any time. $ git push is used to publish the changes to the remote (the central repository on GitHub). It asks for the user name and password.
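My test-file cycle looked roughly like this (ashu_test as in the screenshots; the push prompts for GitHub credentials):

```shell
cd ~/Documents/nsaw
echo "just testing" > ashu_test
git add ashu_test                  # start tracking the new file
git status                         # shows ashu_test as staged
git commit -m "add test file"      # record it as a new version
git rm ashu_test                   # stage the removal as well
git commit -m "remove test file"
git push                           # publish both commits to GitHub
```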

Here are my two pushes to add and remove ashu_test file

Aug 22, 2016- git branches

git allows us to create multiple branches of a code base to address parallel work streams or issue resolution. The way to see these branches, including those in the central (remote) repository, is git branch -a

ashutosh@ubuntu:~/Documents/nsaw$ git branch -a
* master
  remotes/origin/HEAD -> origin/master
  remotes/origin/gh-pages
  remotes/origin/master
  remotes/origin/site

It shows that there are three branches on the remote (GitHub) server: gh-pages, master and site. Each of these branches can have separate development objects or copies of the same objects. The work in each branch can have its own time line, and branches can be merged if we choose to do so. There are many possible use cases for working with branches to address development, issue resolution and testing.

If we need to get these branches onto the local system, the command is

ashutosh@ubuntu:~/Documents/nsaw$ git checkout gh-pages
Branch gh-pages set up to track remote branch gh-pages from origin.
Switched to a new branch ‘gh-pages’

This created a local branch gh-pages and set it up to track the remote branch gh-pages. It also switched git to the gh-pages branch, so subsequent operations like pull and push now apply to gh-pages.

I went ahead and added the branch ‘site’ to my local directory as well. We can use the checkout command to switch between branches.

ashutosh@ubuntu:~/Documents/nsaw$ git checkout master
Switched to branch ‘master’
Your branch is up-to-date with ‘origin/master’.

ashutosh@ubuntu:~/Documents/nsaw$ git branch -a
  gh-pages
* master
  site
  remotes/origin/HEAD -> origin/master
  remotes/origin/gh-pages
  remotes/origin/master
  remotes/origin/site

The * next to master shows that we are now on the master branch.
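The branch commands condensed into one replay (quick-fix is a made-up branch name, to show purely local creation too):

```shell
git branch -a            # list local and remote branches; * marks the current one
git checkout site        # creates a local 'site' tracking remotes/origin/site
git checkout master      # jump back to master
git branch quick-fix     # a new branch can also be created locally...
git checkout quick-fix   # ...and switched to, so all work now lands on it
```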

Aug 25, 2016- Midnight Commander

Linux distributions have dedicated file browsers; for example, Unity has Nautilus and Mate has Caja. Both are very fast and friendly, but nothing comes close to Midnight Commander (mc). Built right into the shell, mc is very useful when I need to read log files. Like most desktop file browsers, mc has two panes. The good thing is, we can configure each pane through the menu. I have been using it for the last two weeks, keeping my left pane as the file tree and the right pane to show the contents of files.

Very useful for viewing the system logs, as I don’t need to open the logs in an editor. Being new to Linux, many times it’s hard for me to determine which is the correct log file. mc lays them all out and allows quick switching between files. Mouse support is handy in the emulator as well as the console.

Sep 6, 2016- Puppy Linux

Peppermint 6 on my Dell laptop was running okay, but it started struggling as I began using Chromium or LibreOffice. Not that I use it on a daily basis, but still it felt like time to change to something lighter and different. As I was looking for really sleek distros, I landed on the Puppy Linux website. As I read through the concepts and design principles, I wanted to give it a shot. Here are the goals of Puppy Linux (adapted from Barry Kauler):

  • Puppy will easily install to USB, Zip, hard drive or other storage media.
  • Booting from CD (or DVD), Puppy can load itself totally into RAM so that the CD (DVD) drive is then free for other purposes.
  • Booting from DVD (or CD), Puppy can save all work to the DVD (CD).
  • Booting from USB flash drive (or other flash media), Puppy will minimize writes to extend its life.
  • Puppy will be extremely friendly for Linux newbies.
  • Puppy will boot up and run extraordinarily fast.
  • Puppy will have all the applications needed for daily use.
  • Puppy will just work, no hassles.
  • Puppy will breathe new life into old PCs.
  • Puppy will load and run totally in RAM for diskless thin stations

These are amazing goals, and I wanted to check how much of this is really achieved. I started with the LTS release of Puppy, known as Precise Puppy. It’s a small download of around 140 MB. There is a detailed tutorial on how to install (or how NOT to install) Puppy. The PDF has step-by-step instructions on downloading, burning a CD, and installing Puppy to a USB drive. Since I was new to the concept of installing a distro to a USB drive from a live CD, it took me a good two hours to get my USB ready. For a second-timer, it should take less than 30 minutes. It absolutely met its goals. It’s actually by far the fastest OS I have ever seen, and by miles. And it’s very stable. It loads itself into RAM on boot and finally saves all the work into a file back on the USB drive.

Encouraged by the success of Precise Puppy, I decided to install Slacko, the latest of Puppy’s derivatives, to the hard disk. 30 minutes later, I had Slacko running on my hard disk. I am still exploring the new distribution, but it sure is the sleekest.

There is a lot to explore about this nice Puppy tomorrow !

Sep 10, 2016- Getting Puppy ready for work and life

It’s been three days now that I have been playing with Slacko Pup. A few things I did to make it work-ready on both my (old) personal desktop and my laptop.

The first thing to check: how does Google Chrome perform on Puppy? Particularly on my laptop, as it has only 2 GB of RAM. I love light weight browsers, but Chrome is (almost) a work necessity for me as my employer uses Google Apps. I can run Google Apps in Firefox, but with Chrome I can have two separate profiles, one for work and another personal. This allows me to keep my bookmarks, extensions, apps, photos and everything else separate, running in two browser windows.

There is a Chromium 53 SFS available for Slacko, but it somehow didn’t work for me. I downloaded the SFS, moved it to /mnt/home and then loaded it into RAM through the GUI tool, but no luck :-(

Then I checked the Puppy Package Manager; luckily, there is a Chrome (version 46) available in the Slackware 14.1 repository. This Chrome is obviously an older version, as Google has stopped supporting 32-bit hardware, but it works well for me. I will try Chromium 53 again if Chrome’s performance doesn’t hold up against Firefox on this old machine.

The next thing was to install Hindi fonts. Lots of posts in my network on social media are in Hindi. Slacko can support Hindi (or any other language) if the required fonts are available in /usr/share/fonts/TTF. Here is a shot of how Hindi looks without the fonts

After downloading the fonts, here is how it looks (Chrome needs a restart)

So far so good.

Sep 11, 2016- gParted

gParted is an essential tool for anyone aspiring to try out multiple distributions of Linux, particularly on old (32-bit) computers where using VMs or containers is just too much load. gParted is available as an application in most Linux distributions and installation CDs. A separate CD burnt from the gParted ISO is also a handy tool.

I first used gParted when I wanted to create a partition for Android Remix. I got confused and ultimately ended up installing Remix on a drive within Ubuntu. Installing Slacko Puppy gave me another chance to look at gParted, and this time I understood the features. Third time’s the charm: today, when I was installing Xenial Puppy (yes I did :-), I finally captured some shots.

In the screen shot below, I shrank the sda1 partition to create around 50 GB of unallocated space. The partition sda1 is currently running Ubuntu Mate (and Remix within Ubuntu); sda2 is running Slacko Pup. As I understand, we can have only four primary partitions on a drive (with the traditional MBR partition table). For SATA drives the names are sda1, sda2, sda3 and sda4. For older IDE-type hard disks the names go hda1 to hda4.

So the first step is to shrink the large sda1 partition to free up ~50 GB. Everything is menu driven (and well documented).

The next step is to allocate this space to a new partition and format it with a specific file system. Typical Linux file systems are ext2, ext3 and ext4; Windows uses FAT32 and NTFS.

Notice this is a primary partition, with file system type ext4. I could have given it a label to identify it, the way I did for sda2, but that is not mandatory.

Here is how the partitions now look on my desktop. Since it was a primary partition and there were already two existing, gParted automatically named it sda3.

Sep 12, 2016- Xenial Pup

Really impressed with the speed and looks of this pup. The icons look like iPhone icons. The speed is even better :-)

I set up my complete work on Xenial today. Chromium profiles for my personal Gmail and work, and those of my Google Apps domains: smbstack.com, thinkinginsap.com and shutri.com. With five windows of Chromium open, the OS still runs like a Mac Pro (actually faster).

Right-clicking anywhere on the desktop opens the whole menu. The best thing about puppies is those three partition icons at the bottom-left corner. It takes one click to mount these partitions and pull your data from other installations.

I moved my Gnote data to this puppy (the .local folder in the home directory). Tried to do the same for Geary; the data for all the mailboxes got copied, but somehow Geary is unable to authenticate the accounts. With Xenial, I have access to pretty much all the software in Ubuntu’s latest release. The Osmo personal organizer looks very good. Installed Flash plug-ins for the local browser. The pup is full of utilities and tools. I need at least one more day to explore, and then back to the command line.

Sep 16, 2016- Permissions

I think everyone knows that Linux is a very secure operating system. The core of this security is granular, file-level permissions. When I started using puppies, I realized that they are pretty much single-user systems; you always work as ‘root’ (admin) in puppies. Before I start comparing the distributions, I realized I know very little about permissions on Linux. Hence a day spent on getting the high-level view.

Just like Windows, we can right-click on any file to see the permissions tab. Each file has an owner and a group assigned to it.

The GUI view is great if we are looking at permissions for one file at a time. But that’s not always the case. The command line provides a much better view of multiple files at the same time.

The first letter indicates any special status (e.g. d for a directory), and the remaining nine indicate the read/write/execute (rwx) permissions for the owner, the group and everyone else, respectively. There is a great article on permissions at Linux Command (except it may be a little different on Ubuntu, where we use sudo for super user access). Btw, sudo is a group as well.
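Changing those bits is chmod’s job. In the numeric form, each slot (owner, group, everyone else) adds read=4, write=2, execute=1. A sketch on a throwaway file:

```shell
touch secret.txt
chmod 640 secret.txt              # owner rw-, group r--, others ---
ls -l secret.txt                  # shows -rw-r----- ...
stat -c '%a %U %G' secret.txt     # numeric mode plus owner and group (GNU stat)
chmod u+x secret.txt              # symbolic form: add execute for the owner
stat -c '%a' secret.txt           # now 740
```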

Doing a $ cat /etc/group will list all the groups available on the system. The command ‘groups’ shows all the groups the current user is a member of. For example below, my user is a member of multiple groups by default. I think you need to be part of the group ‘sudo’ to be able to use the ‘sudo’ command.

ashutosh@ubuntu:~$ groups
ashutosh adm cdrom sudo dip plugdev lpadmin sambashare

Similarly, the command ‘users’ lists the users currently logged in, one entry per login session (which is why my name shows twice below). My system effectively has only one human user.

ashutosh@ubuntu:~$ users
ashutosh ashutosh

The command ‘addgroup’ (or the lower-level ‘groupadd’) creates new groups.

Traditionally, a directory (or file) belongs to only one group, though a user can belong to multiple groups. Thus the only way to provide access to a particular directory (or file) is to add users to the group having access to it. This causes a problem: if we add a user to a particular group, that user gains all the privileges (say rwx) that the group has over that directory, while there can be situations where a user needs only read access. This can be handled through access control lists (ACLs). I will dive into ACLs later.

For now this much is boiling my mind.

Sep 28, 2016- Spotify

Music and computing are probably the two wheels of the geek mind. Linux has a number of iTunes-ish players, but all of them look a bit stale, for music consumption has moved to streaming services: Pandora, Spotify, SoundCloud. All of these can be streamed from a browser, but again a native app has a different feel.

Thanks to the Spotify community, we now have a native client for Ubuntu. It installs in less than 5 minutes and works great, though unsupported. The basic steps:

  • Add the Spotify repository key
  • Add the repository
  • Update packages
  • apt-get install
  • Enjoy the never-ending stream of free music, make and share playlists, and have fun
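Those bullets translate to shell roughly as below. This is a config fragment, not something to paste blindly: the key ID and repository line are the ones the community instructions circulated at the time, and they may well have changed, so verify against the current Spotify for Linux page first.

```shell
# 1. Add the Spotify repository signing key (key ID from the 2016-era community docs)
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 \
     --recv-keys BBEBDCB318AD50EC6865090613B00F1FD2C19886
# 2. Add the repository
echo "deb http://repository.spotify.com stable non-free" | \
     sudo tee /etc/apt/sources.list.d/spotify.list
# 3. Update the package index
sudo apt-get update
# 4. Install the client
sudo apt-get install spotify-client
# 5. Enjoy
```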

Sep 29, 2016- Upgraded Mate to 1.16

Mate released version 1.16 last week. Decided to give it a shot. There is a very helpful ‘How to’ at tipsonubuntu.com that I used to get this version from an unofficial PPA. Installation was a breeze, except that I lost the menu icon in the top left corner. I later figured out that the menu icon can be added back in the panel configuration.

Also clean-installed LibreOffice 5.2.2 (Fresh). Took me a while to decide whether I should uninstall the previous version. After a bit of research, I went for a clean install: remove the previous version, then download and install the new one.

Nov 27, 2016- Slackware.

It’s been a while since I wrote on this thread. Not because I stopped using (or learning) Linux; it was because I decided to switch to Slackware, and it turned out to be a whole new paradigm. I first installed Slack on my laptop. Installation was NOT as simple as Ubuntu, but it was meaningful. It felt like someone wanted to explain the guts of an operating system and, in return, demanded dedication. I found myself willing to invest that time. And I also found myself sucked into it. I wouldn’t call it cult behavior, though it comes close.

After running Slack for a couple of weeks on my laptop, I decided to switch my desktop too. It’s been almost a month now that both my home machines are running Slackware, and it looks like there is no going back. What I learned in the last month and a half is more than in the previous four months of my Linux journey. Slackware is a totally unassuming distribution. Despite being the oldest, it’s probably among the least automated ones. But that doesn’t mean it’s broken. Every aspect of this distribution is well thought through. It demands that we know what we are doing, and provides options to steer the system exactly the way we want. Some of the things the Slack philosophy reinforces:

  • User is NOT dumb
  • Simplicity is NOT deciding (and thus forcing) your choices on users.
  • Simplicity is also NOT dumbing down choices.
  • Simplicity is NOT a shiny interface.
  • Slackware (or any other system) can’t be for everyone. Trying to build something that meets everyone’s aspirations is a compromise that helps no one.
  • Little effort goes a long way in knowing what you are doing.
  • It takes a few iterations to get the vibe; but once you get it, it’s so much fun.
  • You don’t need millions of applications.

July 26, 2019

It’s been almost three years since I started this story, and of course started my journey with Linux. I am pretty stable on Slackware these days. Most of my writing work has moved to GitHub. Writing has never been so much fun.

I have done a detailed post on my writing tools and plumbing. Check out Write Like a coder

