
Unix Variable $* and $# Difference

Script

#!/bin/bash

echo "Using \"\$*\":"
for a in "$*"; do
  echo "$a"
done

echo -e "\nUsing \$*:"
for a in $*; do
  echo "$a"
done

echo -e "\nUsing \"\$@\":"
for a in "$@"; do
  echo "$a"
done

echo -e "\nUsing \$@:"
for a in $@; do
  echo "$a"
done

Run

./variabledif.sh one two "three four"
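A related special parameter, $#, expands to the number of positional parameters. A minimal sketch, using `set --` to simulate the same three arguments as the run above:

```shell
#!/bin/bash
# $# expands to the count of positional parameters.
# Simulate: ./variabledif.sh one two "three four"
set -- one two "three four"
echo "$#"   # prints 3 -- "three four" counts as a single parameter
```

Note that the quoted third argument counts as one parameter, not two.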

The explanation and the results for the four cases are below.

In the first case, "$*" joins all the positional parameters into a single word, so the loop body runs once over one long string:

Using "$*":
one two three four

In the second case, unquoted $* undergoes word splitting, so the for loop sees each word as a separate item:

Using $*:
one
two
three
four

In the third case, "$@" expands each positional parameter as its own quoted word, so the embedded space in "three four" is preserved:

Using "$@":
one
two
three four

In the last case, unquoted $@ behaves like unquoted $*: each parameter is expanded and then word-split, so "three four" is broken apart again, as if the loop read for a in one two three four:

Using $@:
one
two
three
four
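One further difference, not exercised by the script above: "$*" joins the parameters using the first character of IFS, while "$@" is unaffected by IFS. A small sketch assuming the same three arguments:

```shell
#!/bin/bash
# "$*" joins the positional parameters with the first character of IFS;
# "$@" keeps them as separate words regardless of IFS.
set -- one two "three four"

IFS=','
echo "$*"            # prints: one,two,three four

count=0
for a in "$@"; do    # still iterates three times
  count=$((count + 1))
done
echo "$count"        # prints: 3
```

This is why "$@" is almost always the right choice when forwarding a script's arguments to another command.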
