Estimation theory is the branch of statistics concerned with forming estimates, or informed guesses, of unknown quantities from measured data. For example, when we want to recover a quantity from noisy observations, the quantity usually cannot be observed directly; instead, known functional relationships between the observations and the quantity allow us to estimate it.
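As a concrete sketch (with hypothetical values), consider estimating an unknown constant level A from noisy observations x[n] = A + w[n]. A cannot be read off any single sample, but the functional relationship between the samples and A suggests averaging:

```python
import random

def estimate_level(samples):
    """Estimate an unknown constant level as the sample mean."""
    return sum(samples) / len(samples)

# Simulate noisy observations x[n] = A + w[n] for a hypothetical A = 5.0
# corrupted by zero-mean Gaussian noise.
random.seed(0)
A = 5.0
samples = [A + random.gauss(0.0, 1.0) for _ in range(10_000)]
estimate = estimate_level(samples)
```

With many samples the noise averages out and the estimate lands close to the true level.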
CLASSIFICATION OF ESTIMATION THEORY
Estimation approaches fall into the following categories –
Parametric estimation – assumes the data follow a probability density function of known form, so the estimator depends on the statistical model being specified correctly.
Non-parametric estimation – makes no assumption about the form of the underlying density and relies only on the statistics that can be computed from the data themselves.
Classical estimation – the unknown parameter is treated as a deterministic but unknown constant, and the density function of the data is parameterized by it.
Bayesian estimation – the unknown parameter is treated as a random quantity, and prior knowledge about the value to be estimated is supplied to the estimator in the form of a prior density.
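The classical/Bayesian distinction can be sketched for estimating a Gaussian mean (all numbers hypothetical): the classical estimate uses only the data, while the Bayesian estimate blends the data with a prior.

```python
def classical_mean(samples):
    """Classical (maximum likelihood) estimate: the parameter is an
    unknown constant, so only the data are used."""
    return sum(samples) / len(samples)

def bayesian_mean(samples, noise_var, prior_mean, prior_var):
    """Bayesian (posterior-mean) estimate of a Gaussian mean under a
    Gaussian prior: a precision-weighted blend of data and prior."""
    n = len(samples)
    data_precision = n / noise_var
    prior_precision = 1.0 / prior_var
    xbar = sum(samples) / n
    return (data_precision * xbar + prior_precision * prior_mean) / (
        data_precision + prior_precision
    )

samples = [4.8, 5.1, 5.3, 4.9]            # hypothetical measurements
ml = classical_mean(samples)               # sample mean: 5.025
mapped = bayesian_mean(samples, noise_var=1.0, prior_mean=0.0, prior_var=1.0)
# A prior centred at 0 pulls the Bayesian estimate toward 0.
```

With few samples the prior has noticeable influence; as the sample size grows, the data term dominates and the two estimates converge.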
PROCESS OF ESTIMATION
Estimation is carried out by an estimator: a rule that takes the measured data as input and produces estimates of the unknown parameters. Designing one involves the following steps –
First, determine a probabilistic model of the data: the probability distribution of the measurements and how it depends on the unknown parameters of interest. This model, built from the physics of the problem, accounts both for the parameters to be estimated and for the noise and disturbances in the data.
Given the model, theory establishes the precision that any estimator can achieve.
Next, an estimator is derived, using one of several standard methods.
Finally, experiments and simulations are run to test the performance of the estimator.
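The final step above can be sketched as a Monte Carlo check (assumed setup: a constant level in Gaussian noise, estimated by the sample mean):

```python
import random

def monte_carlo_variance(estimator, true_value, noise_std, n_samples,
                         n_trials, seed=0):
    """Measure an estimator's variance empirically by repeated simulation."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_trials):
        data = [true_value + rng.gauss(0.0, noise_std)
                for _ in range(n_samples)]
        estimates.append(estimator(data))
    mean_est = sum(estimates) / n_trials
    return sum((e - mean_est) ** 2 for e in estimates) / n_trials

sample_mean = lambda xs: sum(xs) / len(xs)
var = monte_carlo_variance(sample_mean, true_value=1.0, noise_std=1.0,
                           n_samples=100, n_trials=2000)
# Theory predicts variance sigma^2 / N = 1/100 for the sample mean.
```

Comparing the simulated variance against the theoretical prediction is a standard sanity check on both the estimator and the model.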
METHODS OF DERIVING ESTIMATORS
Cramer–Rao lower bound – gives the minimum variance that can be achieved by any unbiased estimator; an estimator attaining this bound is called efficient.
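For the standard textbook case of a constant level observed in white Gaussian noise, the bound takes the closed form var(Â) ≥ σ²/N. A minimal sketch of that special case:

```python
def crlb_dc_level(noise_var, n_samples):
    """Cramer-Rao lower bound on the variance of any unbiased estimator
    of a constant level in white Gaussian noise: sigma^2 / N."""
    if n_samples <= 0:
        raise ValueError("need at least one sample")
    return noise_var / n_samples

bound = crlb_dc_level(noise_var=1.0, n_samples=100)   # sigma^2 / N = 0.01
```

The sample mean attains this bound exactly, which is why it is the efficient estimator for this problem.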
Minimum variance unbiased estimator – when no efficient estimator exists, a minimum variance unbiased estimator may still be available; it can be found via the Rao–Blackwell–Lehmann–Scheffé procedure.
Best linear unbiased estimator – a suboptimal scheme in which the estimator is constrained to be a linear function of the data.
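A sketch under assumed conditions: for independent measurements of the same quantity with known but unequal noise variances, the best linear unbiased estimator is the inverse-variance-weighted average.

```python
def blue_weighted_mean(samples, variances):
    """BLUE for a common mean from independent measurements with known
    noise variances: weights proportional to 1/variance."""
    weights = [1.0 / v for v in variances]
    total_w = sum(weights)
    return sum(w * x for w, x in zip(weights, samples)) / total_w

x = [10.0, 12.0]     # hypothetical measurements of the same quantity
v = [1.0, 4.0]       # their noise variances (assumed known)
est = blue_weighted_mean(x, v)   # (10/1 + 12/4) / (1 + 0.25) = 10.4
```

The noisier measurement receives a smaller weight, which minimizes the variance among all linear unbiased combinations of the data.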
Maximum likelihood estimator – selects the parameter value that makes the observed data most likely; it is of wide interest because of its asymptotic optimality properties.
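As a small sketch (hypothetical data): for an exponential distribution, maximizing the log-likelihood gives the closed-form ML estimate of the rate, λ̂ = N / Σx[n].

```python
def mle_exponential_rate(samples):
    """ML estimate of an exponential rate parameter.
    Maximizing log L(lambda) = N*log(lambda) - lambda*sum(x)
    yields lambda_hat = N / sum(x)."""
    total = sum(samples)
    if total <= 0:
        raise ValueError("samples must be positive")
    return len(samples) / total

samples = [0.4, 0.6, 0.5, 0.5]          # hypothetical observations
rate = mle_exponential_rate(samples)     # 4 / 2.0 = 2.0
```

As the sample size grows, this estimate becomes unbiased and attains the Cramer–Rao bound, which is the asymptotic optimality the text refers to.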
Method of moments – a very simple approach with no optimality properties, but often satisfactory when a large sample size is available.
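A sketch of the moments approach (hypothetical example): for Uniform(0, θ) the population mean is θ/2, so matching the first sample moment gives θ̂ = 2·x̄.

```python
def mom_uniform_theta(samples):
    """Method-of-moments estimate of theta for Uniform(0, theta):
    set the sample mean equal to the population mean theta/2."""
    sample_mean = sum(samples) / len(samples)
    return 2.0 * sample_mean

samples = [1.0, 3.0, 2.0, 4.0]          # hypothetical draws
theta_hat = mom_uniform_theta(samples)   # 2 * 2.5 = 5.0
```

The estimate is not optimal (the sample maximum carries more information here), but it is trivial to compute and consistent as the sample size grows.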