Numerical Program

Definition of Standard Deviation

The standard deviation is a measure of how far the values in a data set are spread out from their arithmetic mean. It is equal to the square root of the mean of the squared deviations from the mean, that is, the square root of the variance.
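In symbols, for n values x_1, x_2, ..., x_n, this is the population form of the standard deviation (the form the program below computes, since it divides by n rather than n - 1):

    \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i,
    \qquad
    \sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n} \left(x_i - \bar{x}\right)^2}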

Program for Standard Deviation in C Language

  #include <stdio.h>
  #include <math.h>   /* needed for sqrt() */

  int main(void)
  {
	int a[100], n, i;
	/* double avoids the truncation that integer division and an
	   integer result variable would cause */
	double sum = 0, mean, variance = 0, sd;

	printf("Enter the Number of Values :- ");
	scanf("%d", &n);
	printf("Enter %d Values in Array :- ", n);
	for (i = 0; i < n; i++)
	{
		scanf("%d", &a[i]);
	}

	/* Sum the values and compute the arithmetic mean */
	for (i = 0; i < n; i++)
	{
		sum = sum + a[i];
	}
	mean = sum / n;

	/* Accumulate the squared deviations from the mean */
	for (i = 0; i < n; i++)
	{
		variance = variance + (a[i] - mean) * (a[i] - mean);
	}
	variance = variance / n;

	/* Standard deviation is the square root of the variance */
	sd = sqrt(variance);
	printf("Standard Deviation :- %lf\n", sd);
	return 0;
  }

Output
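A sample run, with illustrative input values: for 2 4 4 4 5 5 7 9 the mean is 5, the variance is 32/8 = 4, and the standard deviation is 2.

	Enter the Number of Values :- 8
	Enter 8 Values in Array :- 2 4 4 4 5 5 7 9
	Standard Deviation :- 2.000000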
