tips:threelinestip [2009/03/29 21:58] a
tips:threelinestip [2015/01/07 07:47] (current) mrizvic: run local/remote webserver with python
===== Local/remote webserver with python =====

Serve files on port 8000 to anybody, from the directory where you start this command:

python 2.x
<code bash>
python -m SimpleHTTPServer
</code>

python 3.x
<code bash>
python -m http.server
</code>

If another port is desired (for example 9000), add the port number to the command:
<code bash>
python -m SimpleHTTPServer 9000
</code>
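A quick way to check that the server actually responds, as a sketch: python3 and curl are assumed to be on PATH, and /tmp/wwwdemo and port 8123 are arbitrary choices for the demo.

```shell
# Serve a throwaway directory in the background, fetch a file, then stop the server.
# python3, curl, /tmp/wwwdemo and port 8123 are assumptions for this sketch.
mkdir -p /tmp/wwwdemo
cd /tmp/wwwdemo
echo "hello" > index.html
python3 -m http.server 8123 >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1
RESPONSE=$(curl -s http://127.0.0.1:8123/index.html)
kill $SERVER_PID
echo "$RESPONSE"
```

The server binds to all interfaces by default, so the same URL works from other hosts on the network if the port is reachable.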
===== resolving IP addresses (nmap) =====
-density 300 -quality 80 $file `echo $file | sed '
===== rename upper to lowercase in bash =====

<code bash>
for x in *.JPG; do y=$(echo $x | tr '[A-Z]' '[a-z]'); mv "$x" "$y"; done
</code>
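On bash 4 or newer the same rename works without tr, using the `${x,,}` case-conversion expansion; the directory and file names below are invented for the demo.

```shell
# Lowercase-rename with bash's ${x,,} expansion (requires bash >= 4).
# /tmp/renametest and the PHOTO*.JPG names are demo assumptions.
mkdir -p /tmp/renametest
cd /tmp/renametest
touch PHOTO1.JPG PHOTO2.JPG
for x in *.JPG; do
    [ -e "$x" ] || continue   # skip if the glob matched nothing
    mv -- "$x" "${x,,}"
done
ls
```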
===== Find duplicate files in Linux =====

Let's say you have a folder with 5000 MP3 files you want to check for duplicates, or a directory containing thousands of EPUB files, all with different names, where you suspect some might be duplicates. cd your way in the console to that particular folder and then run:

<code bash>
find -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | xargs -I{} -n1 find -type f -size {}c -print0 | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate
</code>

This will output a list of duplicate files, grouped by their MD5 hash.
Another way is to install fdupes and run:

<code bash>
fdupes -r . > duplicates_list.txt
</code>

The -r flag makes the search recursive. Check duplicates_list.txt afterwards in a text editor for the list of duplicate files.
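The find/md5sum pipeline can be tried on a throwaway directory first; /tmp/duptest and the file contents here are made up for the demo. It first collects file sizes that occur more than once, then hashes only files of those sizes, and finally groups identical hashes.

```shell
# Two identical files and one unique file; the pipeline should report only the pair.
# /tmp/duptest and the file names are demo assumptions; requires GNU find/uniq.
mkdir -p /tmp/duptest
cd /tmp/duptest
echo "same content" > a.txt
echo "same content" > b.txt
echo "different" > c.txt
DUPS=$(find . -not -empty -type f -printf "%s\n" | sort -rn | uniq -d \
  | xargs -I{} -n1 find . -type f -size {}c -print0 \
  | xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate)
echo "$DUPS"
```

Only a.txt and b.txt should appear in the output, on adjacent lines with the same hash; c.txt has a unique size and is never hashed at all.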
===== Linux - Top 10 CPU-hungry apps =====

<code bash>
ps -eo pcpu,pid,user,args | sort -k1 -rn | head -10
</code>
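With procps ps the sorting can also be done by ps itself via --sort, which keeps the header row on top; the column list and the count of ten are the only choices made here.

```shell
# Header line plus the ten most CPU-hungry processes, sorted by ps itself (procps --sort).
ps -eo pcpu,pid,user,args --sort=-pcpu | head -n 11
```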
===== Create static mirror of dynamic web site (e.g. WordPress) =====

<code bash>
wget --mirror -w 2 -p -r -np --html-extension --convert-links -R xmlrpc.php,
</code>
===== Find processes utilizing high memory in human readable format =====

<code bash>
ps -eo size,pid,user,command --sort -size | awk '{ hr=$1/1024 ; printf("%13.2f Mb ",hr) } { for ( x=4 ; x<=NF ; x++ ) { printf("%s ",$x) } print "" }'
</code>
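A related sketch using resident set size (rss), which tracks actual RAM use more closely than the virtual size; the chosen columns and the KB-to-MB awk conversion are assumptions in the same spirit as above.

```shell
# Ten biggest processes by resident memory; ps reports RSS in KB, awk converts to MB.
# Column choices (rss,pid,comm) are assumptions; requires procps --sort.
ps -eo rss,pid,comm --sort=-rss | head -n 11 \
  | awk 'NR==1 {print; next} {printf "%8.1f MB  %6s %s\n", $1/1024, $2, $3}'
```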

