tips:threelinestip (revisions 2008/02/06 12:44 and 2010/10/21 23:28)
Do you have a tip/hack that's a maximum of three lines long?
=== How do I capture the output of "top" to a file? ===

The top command is a very useful tool for capturing information about the processes running on Linux. Often this information needs to be saved to a file, which can be done with the following command:

  top -b -n1 > /

This runs top once, writes the output to the file, and exits.

top can also be run so that it produces multiple reports. To run top 5 times, waiting 5 seconds between each report:

  top -b -n5 -d5 > /
=== Removing whitespace from filenames ===

  # i in *.[jJ][pP][gG];
  ali
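The command above is cut off in this revision. A self-contained sketch of the same idea, assuming the goal is to replace spaces in JPEG filenames with underscores (the directory and filenames below are made up for the demo):

```shell
# Create a scratch directory with made-up filenames containing spaces.
mkdir -p /tmp/renamedemo && cd /tmp/renamedemo
touch "summer holiday.jpg" "family photo.JPG"

# Rename every .jpg/.JPG/.Jpg... file, turning spaces into underscores.
for i in *.[jJ][pP][gG]; do
    # ${i// /_} is bash parameter expansion: every space becomes "_".
    # Skip files that would not change, so mv never sees identical names.
    [ "$i" = "${i// /_}" ] || mv -- "$i" "${i// /_}"
done

ls
```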
:)
=== How to get Webserver Version ===

  # printf "HEAD / HTTP/
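The printf line above is truncated; the usual form sends a raw HEAD request to port 80. A sketch using bash's built-in /dev/tcp instead of nc, against a throwaway local server started only for the demo (python3, port 8080, and the output file are all assumptions):

```shell
# Start a disposable local web server so there is something to query.
python3 -m http.server 8080 >/dev/null 2>&1 &
srv=$!
sleep 1

# Send a raw HEAD request over bash's /dev/tcp and capture the headers.
exec 3<>/dev/tcp/localhost/8080
printf 'HEAD / HTTP/1.0\r\nHost: localhost\r\n\r\n' >&3
cat <&3 > /tmp/head_output.txt
exec 3>&-

kill $srv
head -n 3 /tmp/head_output.txt
```

Against a real host you would point /dev/tcp (or nc) at the server's name and port 80; the Server: header in the response reveals the webserver version.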
+ | |||
+ | === Spliting .tgz file on the fly === | ||
+ | |||
+ | tar -czf /dev/stdout $(DIRECTORY_OR_FILE_TO_COMPRESS) | split -d -b $(CHUNK_SIZE_IN_BYTES) - $(FILE_NAME_PREFIX) | ||
+ | putting them together agian | ||
+ | cat $(FILE_NAME_PREFIX)* >> /dev/stdout | tar -xzf /dev/stdin | ||
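The `$(...)` tokens above are placeholders, not shell substitutions. With concrete made-up values, the full round trip looks like this:

```shell
# Round-trip demo: compress a directory into 1 KiB chunks, then
# reassemble and extract. All names below are made up for the demo.
mkdir -p /tmp/splitdemo/src && cd /tmp/splitdemo
echo "hello from the split demo" > src/hello.txt
head -c 4096 /dev/urandom > src/blob.bin

# Compress to stdout and split into numbered 1 KiB pieces: chunk.00, chunk.01, ...
tar -czf - src | split -d -b 1024 - chunk.

# Put the pieces together again and extract into a second directory.
mkdir -p out && cat chunk.* | tar -xzf - -C out

diff -r src out/src && echo "round trip OK"
```

This is handy for moving a large archive across media or mailers with a per-file size limit.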
+ | |||
+ | === Compile on SMP machines === | ||
+ | |||
+ | export MAKEFLAGS=" | ||
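The export above is cut off; the usual value is a -j job count matching the number of CPUs. That value is an assumption here, since the original is missing:

```shell
# nproc reports the number of available processors; run one make job per CPU.
export MAKEFLAGS="-j$(nproc)"
echo "MAKEFLAGS=$MAKEFLAGS"
```

With MAKEFLAGS exported, every subsequent plain `make` picks up the parallel job count automatically.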
+ | |||
+ | === Patch & Diff === | ||
+ | $ diff -c prog.c.old prog.c > prog.patch | ||
+ | $ patch < prog.patch | ||
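A self-contained round trip of the two commands, with made-up file contents (the target file is named explicitly so the demo does not depend on patch guessing it from the headers):

```shell
# Scratch directory and fabricated "old" and "new" versions of a file.
mkdir -p /tmp/patchdemo && cd /tmp/patchdemo
printf 'int main() { return 1; }\n' > prog.c.old
printf 'int main() { return 0; }\n' > prog.c

# Record the difference in context format, then re-apply it to a copy
# of the old file to reproduce the new one.
diff -c prog.c.old prog.c > prog.patch
cp prog.c.old restored.c
patch restored.c < prog.patch

diff restored.c prog.c && echo "patch applied cleanly"
```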
+ | |||
+ | |||
+ | ===== Printing network traffic ===== | ||
+ | % sudo tcpdump -Atq -s 0 -i en1 | ||
+ | |||
+ | -i en1 will display traffic on your airport card. Use en0 (or nothing, for most systems) for built-in ethernettage. You can add a host line to limit output to a particular host | ||
+ | |||
+ | % sudo tcpdump -Atq -s 0 -i en1 host foobar.com | ||
+ | |||
+ | ===== Couting openfiles per user ===== | ||
+ | |||
+ | # lsof | grep ' root ' | awk ' | ||
+ | |||
+ | Of course, if you want to drop the count and show the actual processes, you can run: | ||
+ | |||
+ | # lsof | grep ' root ' | ||
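The awk part of the first command is cut off above. The counting idea can still be shown on canned lsof-style input (the sample lines below are fabricated), tallying open files per owner with awk:

```shell
# Column 3 of lsof output is the owning user; count lines per user.
# The here-document stands in for real `lsof` output.
awk '{ count[$3]++ } END { for (u in count) print u, count[u] }' <<'EOF' | sort > /tmp/per_user.txt
init       1 root  cwd   DIR   8,1  4096  2 /
init       1 root  txt   REG   8,1 38652 12 /sbin/init
cron     742 root  cwd   DIR   8,1  4096  2 /
sshd     901 sshd  cwd   DIR   8,1  4096  2 /
EOF
cat /tmp/per_user.txt
```

On a live system you would feed `lsof` itself into the awk one-liner instead of the here-document.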
+ | |||
+ | ===== How to check is service is tcpwrapper enable ===== | ||
+ | # ldd `which sshd` | grep -i libwrap | ||
+ | |||
+ | or | ||
+ | |||
+ | # strings `which sshd` | grep -i libwrap | ||
+ | |||
+ | |||
+ | ===== Comparing files & taking some action based on outcome ===== | ||
+ | **Method 1** | ||
+ | < | ||
+ | cmp -s file1 file2 || { | ||
+ | # do something | ||
+ | } | ||
+ | </ | ||
+ | **Method 2** | ||
+ | < | ||
+ | cmp -s file1 file2 | ||
+ | if [ $? = 1 ] | ||
+ | then | ||
+ | # do something | ||
+ | fi | ||
+ | </ | ||
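A runnable demonstration of Method 1 on fabricated files (all names and contents made up for the demo):

```shell
# Make one identical pair and one differing pair of files.
cd /tmp
printf 'same\n' > a.txt;  printf 'same\n' > b.txt
printf 'same\n' > c.txt;  printf 'diff\n' > d.txt

# The || block runs only when cmp finds a difference, so the first
# line prints nothing and the second prints its message.
cmp -s a.txt b.txt || echo "a and b differ"
cmp -s c.txt d.txt || echo "c and d differ"
```

Note that cmp -s exits 0 for identical files, 1 for differing files, and 2 on trouble (e.g. a missing file), which is why Method 2 tests for exactly 1.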
+ | |||
+ | ===== IPv4 parsing / sorting ===== | ||
+ | |||
+ | egrep ' | ||
+ | |||
+ | and | ||
+ | |||
+ | sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4 | ||
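With some sample addresses (made up for the demo), the sort invocation orders the octets numerically rather than lexically:

```shell
# A plain lexical sort would put 192.168.0.100 before 192.168.0.2;
# treating each dot-separated field as a numeric key fixes that.
printf '%s\n' 192.168.0.100 10.0.0.5 192.168.0.2 172.16.30.1 \
  | sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4
```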
+ | |||
+ | |||
+ | |||
+ | ===== finding a string in a " | ||
+ | |||
+ | tail -f filename |grep --line-buffered | ||
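A self-contained way to try it; the log name, pattern, and two-second timeout are all made up for the demo (GNU timeout is assumed):

```shell
# Write a growing log in the background, then follow it briefly,
# printing only ERROR lines as they arrive.
log=/tmp/demo.log
: > "$log"
( for i in 1 2 3; do echo "INFO step $i"; echo "ERROR step $i"; sleep 0.3; done >> "$log" ) &

# --line-buffered makes grep emit each match immediately instead of
# waiting for its output buffer to fill.
timeout 2 tail -f "$log" | grep --line-buffered ERROR > /tmp/errors.txt
cat /tmp/errors.txt
```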
+ | |||
+ | |||
+ | ===== Local/ | ||
+ | |||
+ | Serve files on port 8080 for anybody from the directory from where you start this command: | ||
+ | |||
+ | <code bash|> | ||
+ | :;while [ $? -eq 0 ];do nc -vlp 8080 -c' | ||
+ | "; | ||
+ | </ | ||
+ | |||
+ | |||
+ | ===== resolving IP Addresse (nmap) ===== | ||
+ | |||
+ | <code bash|> | ||
+ | nmap -sL $1 2>/ | ||
+ | perl -ne 'print unless /^Host [\d.]+ /' | | ||
+ | grep 'not scanned' | ||
+ | cut -d ' ' -f 2,3 | | ||
+ | sed -e ' | ||
+ | </ | ||
+ | |||
+ | output: | ||
+ | < | ||
+ | 198.133.219.10 resolves to fed.cisco.com | ||
+ | 198.133.219.11 resolves to asp-web-sj-1.cisco.com | ||
+ | 198.133.219.12 resolves to asp-web-sj-2.cisco.com | ||
+ | 198.133.219.13 resolves to fedtst.cisco.com | ||
+ | 198.133.219.14 resolves to www.netimpactstudy.com | ||
+ | 198.133.219.15 resolves to deployx-sj.cisco.com | ||
+ | 198.133.219.16 resolves to contact-sj1.cisco.com | ||
+ | 198.133.219.17 resolves to scc-sj-1.cisco.com | ||
+ | 198.133.219.18 resolves to scc-sj-2.cisco.com | ||
+ | 198.133.219.19 resolves to scc-sj-3.cisco.com | ||
+ | 198.133.219.20 resolves to jmckerna-test.cisco.com | ||
+ | 198.133.219.21 resolves to events.cisco.com | ||
+ | 198.133.219.22 resolves to bam-prod-1.cisco.com | ||
+ | 198.133.219.23 resolves to redirect.cisco.com | ||
+ | 198.133.219.25 resolves to www.cisco.com | ||
+ | 198.133.219.26 resolves to partners.cisco.com | ||
+ | </ | ||
+ | |||
+ | |||
+ | ===== Linux Get Hardware Serial Number From Command Line ===== | ||
+ | | ||
+ | |||
+ | ===== Convert pdf to jpg ===== | ||
+ | |||
+ | for file in `ls *.pdf`; do convert -verbose -colorspace RGB -resize 800 -interlace none \ | ||
+ | -density 300 -quality 80 $file `echo $file | sed ' | ||
+ | |||
+ | ===== Find duplicate files in Linux ===== | ||
+ | |||
+ | Let’s say you have a folder with 5000 MP3 files you want to check for duplicates. Or a directory containing thousands of EPUB files, all with different names but you have a hunch some of them might be duplicates. You can cd your way in the console up to that particular folder and then do a | ||
+ | |||
+ | < | ||
+ | find -not -empty -type f -printf “%s\n” | sort -rn | uniq -d | xargs -I{} -n1 find -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate | ||
+ | </ | ||
+ | |||
+ | This will output a list of files that are duplicates, according tot their HASH signature. | ||
+ | Another way is to install fdupes and do a | ||
+ | |||
+ | | ||
+ | |||
+ | The -r is for recursivity. Check the duplicates_list.txt afterwards in a text editor for a list of duplicate files. | ||
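The find/md5sum pipeline can be exercised on a scratch directory with fabricated contents; only the pair with identical bytes should be reported:

```shell
# Two files share content, a third is unique. The pipeline first groups
# files by size, then hashes only the size-collisions, and finally keeps
# groups whose MD5 (first 32 chars of md5sum output) repeats.
mkdir -p /tmp/dupedemo && cd /tmp/dupedemo
echo "duplicate payload" > song1.mp3
echo "duplicate payload" > song2.mp3
echo "something else"    > song3.mp3

find . -not -empty -type f -printf "%s\n" | sort -rn | uniq -d \
  | xargs -I{} -n1 find . -type f -size {}c -print0 \
  | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate > /tmp/dupes.txt

cat /tmp/dupes.txt
```

The size pre-filter is what makes this tolerable on thousands of files: hashing happens only for files whose byte count collides with another file's.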
+ | |||
+ | |||
+ | |||
+ | |||
+ | |||
+ |