Three lines tip

Do you have a tip/hack that's max. three lines long?

How do I capture the output of "top" to a file?

The command top is a very useful tool for capturing information about processes running on Linux. Often this information needs to be saved to a file, which can be done with the following command:

  top -b -n1 > /tmp/top.txt

This will run top once, write the output to the file, and exit.

The command top can also be run so that it produces multiple reports. To run top 5 times with a 5-second delay between each report, the command would be:

 top -b -n5 -d5 > /tmp/top.txt

Removing whitespace from filenames

 # for i in *.[jJ][pP][gG]; do mv "$i" "$(echo "$i" | sed 's/ /_/g')"; done


 # rename 's/ /_/g' *.[jJ][pP][gG]

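The sed-based loop can be tried safely in a scratch directory first. A runnable sketch (the sample file name below is made up for the demo):

```shell
# Create a throwaway directory with a sample file, then run the rename loop.
d=$(mktemp -d) && cd "$d"
touch "summer trip 01.jpg"        # hypothetical file with spaces in its name
for i in *.[jJ][pP][gG]; do
  mv "$i" "$(printf '%s\n' "$i" | sed 's/ /_/g')"
done
ls
```

Using printf instead of echo avoids surprises with file names that start with a dash.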

How to get Webserver Version

 # printf "HEAD / HTTP/1.0\r\n\r\n" | nc $server 80 | grep Server

Splitting a .tgz file on the fly

 tar -czf - "$DIRECTORY_OR_FILE_TO_COMPRESS" | split -d -b "$CHUNK_SIZE_IN_BYTES" - "$FILE_NAME_PREFIX"

Putting them back together again

 cat "$FILE_NAME_PREFIX"* | tar -xzf -
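The whole round trip can be exercised on a tiny example. A sketch with made-up directory names and a 1 KiB chunk size:

```shell
# Compress+split a small directory, then reassemble the chunks and extract.
src=$(mktemp -d); dst=$(mktemp -d)
cd "$src" && mkdir data && echo "hello" > data/file.txt
tar -czf - data | split -d -b 1024 - part.   # produces part.00, part.01, ...
cd "$dst" && cat "$src"/part.* | tar -xzf -  # reassemble and extract
cat data/file.txt
```

Because the shell expands part.* in lexical order, the numeric suffixes from split -d guarantee the chunks are concatenated in the right sequence.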

Compile on SMP machines

 export MAKEFLAGS="-j3"

Patch & Diff

 $ diff -c prog.c.old prog.c > prog.patch
 $ patch < prog.patch
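The diff/patch round trip can be verified end to end. A sketch with made-up file contents:

```shell
# Make two versions, diff them, then patch a copy of the old one up to date.
d=$(mktemp -d) && cd "$d"
printf 'return 0;\n' > prog.c.old
printf 'return 1;\n' > prog.c
diff -c prog.c.old prog.c > prog.patch || true   # diff exits 1 when files differ
cp prog.c.old prog.c.restored
patch prog.c.restored < prog.patch               # apply the patch to the old copy
cmp -s prog.c prog.c.restored && echo "patched OK"
```

Given an explicit file argument, patch applies the hunks to that file regardless of the names recorded in the patch header.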

Printing network traffic

 % sudo tcpdump -Atq -s 0 -i en1

-i en1 will display traffic on your AirPort card. Use en0 (or omit -i entirely, for most systems) for built-in Ethernet. You can add a host clause to limit output to a particular host:

 % sudo tcpdump -Atq -s 0 -i en1 host <hostname>

Counting open files per user

 # lsof | grep ' root ' | awk '{print $NF}' | sort | wc -l

Of course, if you want to drop the count and show the actual processes, you can run:

 # lsof | grep ' root '
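The same idea generalizes to a tally for every user at once. A sketch of the counting pipeline, run here on canned lsof-style input (in practice you would pipe real lsof output into the awk stage; the sample users are made up):

```shell
# Column 3 of lsof output is the user; skip the header, then count per user.
printf 'COMMAND PID USER FD\nsshd 1 root 3r\nbash 2 alice 4r\nvim 3 alice 5u\n' |
  awk 'NR>1 {print $3}' | sort | uniq -c | sort -rn
```

sort must run before uniq -c, since uniq only collapses adjacent duplicate lines.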

How to check if a service is tcpwrapper-enabled

 # ldd `which sshd` | grep -i libwrap


 # strings `which sshd` | grep -i libwrap

Comparing files and taking action based on the outcome

Method 1

 cmp -s file1 file2 || {
   # do something
 }

Method 2

 cmp -s file1 file2
 if [ $? -ne 0 ]; then
   # do something
 fi

(cmp exits 0 when the files match, 1 when they differ, and 2 on trouble)
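A complete, runnable version of method 1 (the sample files are made up for the demo):

```shell
# Two files with different contents, so the || branch fires.
d=$(mktemp -d) && cd "$d"
echo "a" > file1
echo "b" > file2
cmp -s file1 file2 || {
  echo "files differ"   # do something useful here
}
```

-s keeps cmp silent, so the only output is whatever the action block produces.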

IPv4 parsing / sorting

 egrep '([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}'


 sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4
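The two pieces combine naturally: extract IPv4 addresses from mixed text, then sort them octet by octet. A sketch on made-up sample lines (-o makes grep print only the matching addresses):

```shell
# Pull the addresses out of the text, then numeric-sort each dot field.
printf 'host a 10.0.0.12\nhost b 10.0.0.2\nhost c 192.168.1.1\n' |
  grep -Eo '([[:digit:]]{1,3}\.){3}[[:digit:]]{1,3}' |
  sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4
```

Plain lexical sort would put 10.0.0.12 before 10.0.0.2; the per-field numeric keys give the expected order.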

Finding a string in a "live" stream

 tail -f filename | grep --line-buffered STRING

Local/remote webserver

Serve files on port 8080 to anybody from the directory where you start this command (requires a netcat build that supports -c, such as GNU netcat):

:;while [ $? -eq 0 ];do nc -vlp 8080 -c'(r=read;e=echo;$r a b c;z=$r;while [ ${#z} -gt 2 ];do $r z;done;f=`$e $b|sed 's/[^a-z0-9_.-]//gi'`;h="HTTP/1.0";o="$h 200 OK\r\n";c="Content";if [ -z $f ];then($e $o;ls|(while $r n;do if [ -f "$n" ]; then $e "<a href=\"/$n\">`ls -gh $n`
";fi;done););elif [ -f $f ];then $e "$o$c-Type: `file -ib $f`\n$c-Length: `stat -c%s $f`";$e;cat $f;else $e -e "$h 404 Not Found\n\n404\n";fi)';done

Local/remote webserver #2 (python)

Serve files on port 8000 to anybody from the directory where you start this command:

python 2.x

 python -m SimpleHTTPServer

python 3.x

 python -m http.server

If another port is desired (for example 9000), append the port number to the command:

 python -m SimpleHTTPServer 9000

(with Python 3.x: python -m http.server 9000)

Resolving IP addresses (nmap)

 nmap -sL $1 2>/dev/null |
 perl -ne 'print unless /^Host [\d.]+ /' |
 grep 'not scanned' |
 cut -d ' ' -f 2,3 |
 sed -e 's/\(.*\) (\(.*\))/\2 resolves to \1/'

output: one "NAME resolves to ADDRESS" line per host

Linux Get Hardware Serial Number From Command Line

 # dmidecode | grep -i serial

Convert pdf to jpg

 for file in *.pdf; do convert -verbose -colorspace RGB -resize 800 -interlace none \
   -density 300 -quality 80 "$file" "$(echo "$file" | sed 's/\.pdf$/.jpg/')"; done

rename upper to lowercase in bash

 for x in *.JPG; do y=$(echo "$x" | tr '[:upper:]' '[:lower:]'); echo "$y"; mv "$x" "$y"; done
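The loop is worth quoting carefully so it survives spaces in file names. A runnable sketch on made-up sample files (in bash 4+, mv "$x" "${x,,}" would lowercase without the tr pipe at all):

```shell
# Lowercase every *.JPG in a scratch directory, quoting each expansion.
d=$(mktemp -d) && cd "$d"
touch HOLIDAY.JPG "SUMMER TRIP.JPG"   # hypothetical sample names
for x in *.JPG; do
  y=$(printf '%s\n' "$x" | tr '[:upper:]' '[:lower:]')
  mv "$x" "$y"
done
ls
```

The [:upper:]/[:lower:] classes are locale-safe, unlike the raw A-Z ranges.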

Find duplicate files in Linux

Let's say you have a folder with 5000 MP3 files you want to check for duplicates, or a directory containing thousands of EPUB files, all with different names, where you have a hunch some of them might be duplicates. cd into that folder in the console and run:

 find -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate

This will output a list of duplicate files, grouped according to their MD5 hash. Another way is to install fdupes and run:

 fdupes -r ./folder > duplicates_list.txt

The -r is for recursion. Afterwards, open duplicates_list.txt in a text editor for the list of duplicate files.
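The md5sum-based grouping can be seen working on a miniature example (file names and contents are made up; two of the three files share content):

```shell
# uniq -w32 compares only the first 32 chars (the MD5 hash) of each sorted
# line, and --all-repeated=separate keeps only the repeated-hash groups.
d=$(mktemp -d) && cd "$d"
echo "same" > a.txt
echo "same" > b.txt
echo "other" > c.txt
md5sum ./*.txt | sort | uniq -w32 --all-repeated=separate
```

Only a.txt and b.txt appear in the output; c.txt's hash is unique and is dropped.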

Linux - Top 10 CPU-hungry apps

 ps -eo pcpu,pid,args | sort -k1 -nr | head -10

Create static mirror of dynamic web site (ex. Wordpress)

 wget --mirror -w 2 -p -r -np --html-extension --convert-links -R xmlrpc.php,trackback <URL>

Find processes utilizing high memory, in human-readable format

 ps -eo size,pid,user,command | sort -rn | head -5 | awk '{ hr[1024^2]="GB"; hr[1024]="MB"; for (x=1024^3; x>=1024; x/=1024) { if ($1>=x) { printf ("%-6.2f %s ", $1/x, hr[x]); break } } } { printf ("%-6s %-10s ", $2, $3) } { for ( x=4 ; x<=NF ; x++ ) { printf ("%s ",$x) } print ("\n") }'
tips/threelinestip.txt · Last modified: 2015/01/07 07:47 by mrizvic
CC Attribution-Noncommercial-Share Alike 4.0 International