Monday 31 August 2015

Sed Tetris by uuner

This script is a great example of the power of bash and sed. :-)

It is a game of Tetris, written in sed and bash by uuner, that runs right in your shell.

The game has all the expected features: rotating blocks, increasing drop speed, and more.


So, shell scripts are not limited to automating your tasks; you can even create games with them. Many games have been written as shell scripts, such as snakes, tanks, and many more.

The code for sed Tetris is available on this page:

http://uuner.livejournal.com/55238.html

Thank you.

Saturday 29 August 2015

Auto-update and upgrade your Atom editor package

So, Atom is one of my favorite text editors these days, and it has a great community. You can download it and read more about it here: https://atom.io/



Unfortunately, it is difficult to keep Atom up to date: it is not available in the Ubuntu Software Center, and Atom has no built-in update option yet.

So, I found a very useful script on askubuntu.com. You can use it to download, update, and upgrade your Atom package, and you can add it to your cron jobs.

Here is the code; save it as atom-auto-update.sh and make it executable.
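The script itself did not survive in this copy of the post, so below is a minimal sketch of what such an updater could look like. The GitHub "latest release" URL and the asset name atom-amd64.deb are assumptions; check the Atom releases page for the actual file name. The update step is left commented out so nothing gets installed by accident.

```shell
#!/usr/bin/env bash
# atom-auto-update.sh -- hypothetical sketch of an Atom updater.
# Assumption: the newest .deb is published as atom-amd64.deb behind
# GitHub's "latest release" redirect; adjust the URL for your setup.
set -euo pipefail

ATOM_DEB_URL="https://github.com/atom/atom/releases/latest/download/atom-amd64.deb"
DEB_PATH="/tmp/atom-amd64.deb"

installed_version() {
    # Print the installed Atom version, or "none" if it is not installed.
    dpkg-query -W -f='${Version}' atom 2>/dev/null || echo "none"
}

update_atom() {
    echo "Atom before update: $(installed_version)"
    wget --quiet -O "$DEB_PATH" "$ATOM_DEB_URL"   # fetch the newest build
    sudo dpkg -i "$DEB_PATH"                      # install / upgrade in place
    rm -f "$DEB_PATH"
    echo "Atom after update: $(installed_version)"
}

# Uncomment the next line to actually perform the update:
# update_atom
```

Once you are happy with it, make it executable (`chmod +x atom-auto-update.sh`) and add it to your crontab to get automatic updates.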

Save it and run it, and enjoy the power of a freshly upgraded Atom.

Thank you.

One-liner to remove duplicate entries using awk

Hello Everyone,

Today we will discuss a one-liner for removing duplicate entries. Sometimes we have a file with lots of repetitive or duplicate lines, and we need to remove all those entries or count how many times they occur.

Let's suppose we have a file with below content,

my_temp_file:

this
is
a
test
file
a
file
with
repetitive
content.
Ways
to
remove
duplicate
duplicate
lines

Now, to remove the duplicate entries, use the awk one-liner below:

$ awk '!a[$0]++' my_temp_file
this
is
a
test
file
with
repetitive
content.
Ways
to
remove
duplicate
lines
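
Why does `!a[$0]++` work? awk keeps an associative array `a` keyed by the whole line (`$0`). The first time a line appears, `a[$0]` is 0, so `!a[$0]` is true and awk runs its default action, which is to print the line; the `++` then bumps the counter, so every later copy of that line evaluates to false and is skipped. Spelled out on a small sample input, it is equivalent to:

```shell
# Long form of awk '!a[$0]++': print each line only the first time it is
# seen, preserving the original order of the input.
printf 'this\nis\na\ntest\nfile\na\nfile\n' |
    awk '{ if (seen[$0] == 0) print $0; seen[$0]++ }'
# Output (duplicates "a" and "file" dropped):
#   this
#   is
#   a
#   test
#   file
```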


Now, if you don't care about the order, you can use sort and uniq like this:

$ sort my_temp_file | uniq
a
content.
duplicate
file
is
lines
remove
repetitive
test
this
to
Ways
with

It does the same thing, but it does not preserve the order of the lines in the file.
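The same tools also handle the counting case mentioned at the start. `uniq -c` prefixes each line with its count (the input must be sorted first), and a small awk program does the same tally without sorting:

```shell
# Count occurrences with sort + uniq -c (counts are left-padded).
printf 'a\nb\na\nc\nb\na\n' | sort | uniq -c

# The same with awk: tally every line, print the counts at the end.
# (for-in iteration order is unspecified, so output order may vary.)
printf 'a\nb\na\nc\nb\na\n' |
    awk '{ count[$0]++ } END { for (line in count) print count[line], line }'
```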

You can watch the YouTube video here:



Thank you.

If you have any doubts, add a comment below.