
20 Essential Linux Shell Text Processing Tips


Whether you are a system administrator, a developer, or a power user, mastering shell-based text processing is a foundational skill. These tools help you extract signal from noise, automate repetitive tasks, and dramatically improve command-line productivity.


📁 File Search & Handling

  1. Find specific files

    find /path -name "filename"
    
  2. Search and delete files

    find /path -name "*.log" -delete


    Run the command without -delete first to preview which files will be removed.
    
  3. Find files by type

    find /path -type f -name "*.txt"
    

    Finds regular files only.

  4. Recursive text search

    grep -R "text" /path
    
  5. Case-insensitive search

    grep -i "text" file
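
The find and grep options above combine naturally. As a quick sketch (the directory, file glob, and search term here are only placeholders), the following restricts a case-insensitive search to configuration files and lists just the files that match:

    # Search every .conf file under /etc for "network", ignoring case,
    # and print only the names of the files that contain a match.
    find /etc -type f -name "*.conf" -exec grep -il "network" {} +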
    

🔃 Sorting & De-duplication

  1. Sort text alphabetically

    sort file.txt
    
  2. Reverse sort

    sort -r file.txt
    
  3. Remove duplicate lines

    sort file.txt | uniq
    

    uniq only removes duplicates on adjacent lines, so the input must be sorted first; sort -u file.txt achieves the same result in one step.

  4. Count occurrences of each line

    sort file.txt | uniq -c


    Prefixes each unique line with the number of times it appears; a frequency-ranked variant is sketched after this list.
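
Chaining one more sort onto the counting pipeline ranks lines by how often they appear. A small sketch, using the same placeholder file.txt:

    # Show the five most frequent lines, highest count first.
    sort file.txt | uniq -c | sort -rn | head -n 5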
    

🔤 Character Conversion & Extraction

  1. Character replacement

    echo "apple" | tr 'a' 'A'
    

    Output: Apple (tr translates every occurrence of the character, not just the first).

  2. Delete characters

    echo "text123" | tr -d '0-9'
    
  3. Extract specific columns

    cut -d, -f2 file.csv
    

    Extracts the second column using , as delimiter.

  4. Merge file columns

    paste file1.txt file2.txt


    Joins corresponding lines from the two files side by side, separated by a tab.
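
cut and tr also chain together well. As a quick sketch (file.csv is a placeholder), the following upper-cases the first column of a CSV file:

    # Extract the first comma-separated column and convert it to uppercase.
    cut -d, -f1 file.csv | tr 'a-z' 'A-Z'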
    

📊 Statistics & Formatting

  1. Count lines, words, and bytes

    wc file.txt
    

    Use -l to count lines only.

  2. Count specific word occurrences

    grep -o "word" file.txt | wc -l


    Counts every match, including matches inside longer words; add -w to count whole words only.
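
Combined with the sorting tools from earlier, counting extends to a rough word-frequency report. This sketch splits on spaces only, so treat it as an approximation:

    # One word per line, then count, rank, and show the top ten.
    tr -s ' ' '\n' < file.txt | sort | uniq -c | sort -rn | head -n 10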
    

✏️ Text Substitution & Pattern Matching

  1. Global text replacement

    sed 's/old/new/g' file.txt
    
  2. Conditional replacement

    sed '/pattern/s/old/new/' file.txt
    

    Applies replacement only on matching lines.
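
By default sed prints the result to standard output and leaves the file unchanged. GNU sed can edit the file in place while keeping a backup copy (BSD/macOS sed expects the backup suffix as a separate argument):

    # Replace old with new throughout the file, saving the original as file.txt.bak.
    sed -i.bak 's/old/new/g' file.txt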


🧠 Advanced Stream Processing with awk

  1. Print specific columns

    awk '{print $1}' file.txt
    
  2. Conditional filtering

    awk '$1 > 10' file.txt


    With no action specified, awk prints every line where the condition is true; here, lines whose first field is greater than 10.
    
  3. Custom field separators

    awk -F: '{print $1}' /etc/passwd
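
awk can also aggregate across lines with an END block. A small sketch, assuming a whitespace-separated file.txt whose third column is numeric:

    # Sum the third column and print the total after the last line is read.
    awk '{sum += $3} END {print "total:", sum}' file.txt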
    

🧰 Core Tool Summary

Tool   Primary Purpose
grep   Search for patterns in text
sed    Stream editing and substitution
awk    Pattern scanning and data processing
tr     Character translation or deletion
sort   Line-based sorting

Shell text processing tools may look simple, but together they form a powerful data-processing pipeline. By combining them effectively, you can handle large datasets, automate system tasks, and turn the Linux command line into a precise and expressive productivity engine.
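
As a closing example of that pipeline idea, here is a sketch that ranks the five most frequent client IPs in a hypothetical access.log, assuming the IP address is the first whitespace-separated field:

    # Top five client IPs by request count.
    awk '{print $1}' access.log | sort | uniq -c | sort -rn | head -n 5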
