Definition of an Algorithm: An algorithm is any well-defined computational procedure that is composed of a finite set of steps and takes some value or set of values as input and produces some value or set of values as output in a finite length of time.
Properties of an Algorithm: An algorithm must satisfy the following criteria:
1) Input: Zero or more quantities are externally supplied.
2) Output: At least one quantity is produced.
3) Definiteness: Each instruction is clear and unambiguous.
4) Finiteness: If we trace out the instructions of an algorithm, then for all cases the algorithm terminates after a finite number of steps.
5) Effectiveness: Every instruction must be basic enough that it can, in principle, be carried out by a person using only pencil and paper. It is not enough that each operation be definite, as in criterion 3; it must also be feasible.
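As an illustration of these criteria, Euclid's algorithm for the greatest common divisor (a standard example, not one given in the notes above) satisfies all five: it takes input, produces output, and every step is definite, effective, and terminating.

```python
def gcd(a, b):
    """Euclid's algorithm for the greatest common divisor."""
    # Input: two non-negative integers a and b (criterion 1).
    while b != 0:          # b strictly decreases each pass, so the loop
        a, b = b, a % b    # terminates in finitely many steps (criterion 4);
                           # each instruction is unambiguous (criterion 3)
                           # and simple enough to do by hand (criterion 5).
    return a               # Output: exactly one quantity (criterion 2).

print(gcd(48, 18))  # -> 6
```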
Structure of an Algorithm
1) Input Step
2) Assignment Step
3) Decision Step
4) Repetitive Step
5) Output Step
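The five structural steps can be seen together in a small sketch (the find-maximum example is an assumption, not taken from the notes):

```python
def find_max(numbers):
    largest = numbers[0]       # Assignment step: initialize the result
    for x in numbers[1:]:      # Repetitive step: examine each element
        if x > largest:        # Decision step: compare against current max
            largest = x        # Assignment step: update the result
    return largest             # Output step: produce the answer

data = [3, 41, 12, 9, 74, 15]  # Input step: the data supplied externally
print(find_max(data))  # -> 74
```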
Analysis of Algorithms
Efficiency of an algorithm can be measured in terms of:
• Execution time (time complexity)
• The amount of memory required (space complexity)
Time Complexity: A measure of the amount of time required to execute an algorithm.
a) Analysis is based on the amount of work done by the algorithm
b) Time complexity expresses the relationship between the size of the input and the run time for the algorithm
c) Usually expressed as a proportionality, rather than an exact function
· Factors that should not affect time complexity analysis:
1) The programming language chosen to implement the algorithm.
2) The quality of the compiler.
3) The speed of the computer on which the algorithm is to be executed
· Factors that affect the actual running time (but not the analysis):
1) Characteristics of the computer system (e.g. processor speed, amount of memory, file-system type, etc.)
2) The way the algorithm is implemented
3) The particular instance of data the algorithm is operating on (e.g., amount of data, type of data).
· Measures of work used in the analysis:
1) Number of arithmetic operations performed
2) Number of comparisons made
3) Number of times through a critical loop
4) Number of array elements accessed
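One way to make these work measures concrete is to instrument an algorithm to count its comparisons. The linear-search sketch below is an assumption for illustration; in the worst case (target absent or in the last position) it makes n comparisons, so its time complexity is proportional to n.

```python
def linear_search(items, target):
    """Return (index, comparisons) for target in items, or (-1, comparisons)."""
    comparisons = 0
    for index, value in enumerate(items):
        comparisons += 1           # count each comparison made
        if value == target:
            return index, comparisons
    return -1, comparisons

# Worst case: the target is the last element, giving n comparisons.
print(linear_search(list(range(10)), 9))  # -> (9, 10)
```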
Asymptotic Notations: Asymptotic notations provide terminology that enables us to make meaningful statements about the time and space complexities of an algorithm. Commonly used asymptotic notations are:
1. Big-Oh Notation (Upper Bound)
2. Omega Notation (Lower Bound)
3. Theta Notation (Tight Bound)
4. Little-oh Notation
5. Little-omega Notation
Definition: The function f(n) = O(g(n)) (read as “f of n is big-oh of g of n”) iff there exist positive constants c and n0 such that f(n) ≤ c * g(n) for all n≥n0.
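The definition can be checked numerically for a candidate pair of constants. The sketch below (the function name and the choice f(n) = 3n + 2, c = 4, n0 = 2 are illustrative assumptions) verifies the inequality f(n) ≤ c·g(n) over a finite range of n; a pencil-and-paper proof would show 3n + 2 ≤ 4n whenever n ≥ 2.

```python
def witnesses_big_oh(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all n0 <= n < n_max (a finite spot-check,
    not a proof, since the definition quantifies over all n >= n0)."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

# f(n) = 3n + 2 is O(n): the constants c = 4 and n0 = 2 work.
print(witnesses_big_oh(lambda n: 3 * n + 2, lambda n: n, c=4, n0=2))  # -> True
```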
Definition: The function f(n) = Ω(g(n)) (read as “f of n is omega of g of n”) iff there exist positive constants c and n0 such that f(n) ≥ c * g(n) for all n≥n0.
Definition: The function f(n) = Θ(g(n)) (read as “f of n is theta of g of n”) iff there exist positive constants c1, c2 and n0 such that c1 * g(n) ≤ f(n) ≤ c2 * g(n) for all n≥n0.
Definition: The function f(n) = o(g(n)) (read as “f of n is little-oh of g of n”) iff for every positive constant c there exists an n0 such that f(n) < c * g(n) for all n≥n0.
Definition: The function f(n) = ω(g(n)) (read as “f of n is little-omega of g of n”) iff for every positive constant c there exists an n0 such that f(n) > c * g(n) for all n≥n0.
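The theta definition sandwiches f(n) between two multiples of g(n), so it can be spot-checked the same way as the upper-bound definition. In the sketch below (the function name and the constants c1 = 3, c2 = 4, n0 = 2 for f(n) = 3n + 2 are illustrative assumptions), the lower inequality 3n ≤ 3n + 2 holds for all n, and the upper inequality 3n + 2 ≤ 4n holds for n ≥ 2, so f(n) = Θ(n).

```python
def witnesses_theta(f, g, c1, c2, n0, n_max=10_000):
    """Check c1 * g(n) <= f(n) <= c2 * g(n) for all n0 <= n < n_max
    (a finite spot-check of the two-sided bound, not a proof)."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))

# f(n) = 3n + 2 is Theta(n): constants c1 = 3, c2 = 4, n0 = 2 work.
print(witnesses_theta(lambda n: 3 * n + 2, lambda n: n, c1=3, c2=4, n0=2))  # -> True
```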