Estimation theory is the branch of statistics and signal processing concerned with estimating the values of unknown parameters from measured data. For example, if we want to recover a quantity from noisy observations, we usually cannot observe its magnitude directly, but known functional relationships between the data and the parameter allow us to form an estimate of it.
CLASSIFICATION OF ESTIMATION THEORY
Estimation problems fall into the following broad categories:
Parametric estimation – the probability density function of the data is assumed to belong to a known family, and the estimator exploits this assumed form.
Non-parametric estimation – no specific form of the density is assumed; the estimator relies only on the data and on mild statistical assumptions.
Classical estimation – the unknown parameter is treated as a deterministic constant, and the density function of the data is parameterized by it.
Bayesian estimation – the unknown parameter is treated as a random variable, and the estimator incorporates prior knowledge about it in the form of a prior density.
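The difference between the classical and Bayesian viewpoints can be sketched for the simplest case, estimating the mean of Gaussian data. All the specific numbers below (the true mean, noise level, and prior) are illustrative assumptions, not values from the text:

```python
import random
import statistics

# Sketch: estimating the mean theta of Gaussian data under the two viewpoints.
random.seed(0)
true_theta = 2.0      # unknown parameter (a fixed constant in the classical view)
sigma = 1.0           # known noise standard deviation
n = 50
data = [random.gauss(true_theta, sigma) for _ in range(n)]

# Classical estimation: theta is a fixed unknown; the maximum-likelihood
# estimate of a Gaussian mean is simply the sample mean.
theta_classical = statistics.mean(data)

# Bayesian estimation: theta is random with an assumed prior N(mu0, tau0^2).
# The posterior mean blends the prior mean and the sample mean,
# weighted by their precisions (inverse variances).
mu0, tau0 = 0.0, 2.0
post_precision = n / sigma**2 + 1 / tau0**2
theta_bayes = (n / sigma**2 * theta_classical + mu0 / tau0**2) / post_precision

print(f"classical (ML) estimate: {theta_classical:.3f}")
print(f"Bayesian posterior mean: {theta_bayes:.3f}")
```

Note how the Bayesian estimate is pulled slightly toward the prior mean; with more data the two estimates converge.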
PROCESS OF ESTIMATION
Estimation proceeds by designing an estimator: a rule that takes the measured data as input and produces an estimate of the parameters of interest. It involves the following steps –
First, determine the probability distribution of the measured data and how it depends on the unknown parameters of interest. This distribution, together with the errors and disturbances it accounts for, is the physical or statistical model on which everything else is built.
Next, the model is analysed to establish the precision that any estimator can achieve in theory.
Then, a concrete estimator is derived, which can be done by several standard methods.
Finally, experiments and simulations are run to test the performance of the estimator.
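The last step above, testing an estimator by simulation, can be sketched as a Monte Carlo experiment. Here the sample mean is tested as an estimator of a Gaussian mean; the trial counts and parameters are illustrative assumptions:

```python
import random
import statistics

# Monte Carlo check of an estimator: run many trials, then compare the
# empirical bias and mean-squared error against theory.
random.seed(1)
true_mean, sigma, n, trials = 5.0, 2.0, 25, 2000

estimates = []
for _ in range(trials):
    sample = [random.gauss(true_mean, sigma) for _ in range(n)]
    estimates.append(statistics.mean(sample))

bias = statistics.mean(estimates) - true_mean
mse = statistics.mean((e - true_mean) ** 2 for e in estimates)

# Theory predicts zero bias and variance sigma^2 / n = 0.16 for this setup.
print(f"empirical bias: {bias:.4f}")
print(f"empirical MSE : {mse:.4f}  (theory: {sigma**2 / n:.4f})")
```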
METHODS OF FINDING ESTIMATORS
Cramer–Rao lower bound (CRLB) – it gives the minimum variance that any unbiased estimator can achieve; an estimator that attains this bound is called efficient.
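As a concrete check, for n independent samples from N(theta, sigma^2) the CRLB for theta works out to sigma^2 / n, and the sample mean attains it. A minimal simulation, with illustrative parameters:

```python
import random
import statistics

# Empirical check that the sample mean attains the CRLB (sigma^2 / n)
# for the mean of Gaussian data, i.e. that it is an efficient estimator.
random.seed(2)
theta, sigma, n, trials = 0.0, 1.0, 20, 3000
crlb = sigma**2 / n   # = 0.05 for these numbers

means = []
for _ in range(trials):
    sample = [random.gauss(theta, sigma) for _ in range(n)]
    means.append(statistics.mean(sample))

empirical_var = statistics.pvariance(means)
print(f"CRLB            : {crlb:.4f}")
print(f"sample-mean var : {empirical_var:.4f}")
```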
Minimum variance unbiased estimator (MVUE) – when no efficient estimator exists, a minimum variance unbiased estimator may still exist. One systematic way to find it is the Rao–Blackwell–Lehmann–Scheffé procedure.
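To see what "minimum variance among unbiased estimators" means in practice, compare two unbiased estimators of a Gaussian mean: the sample mean (which is the MVUE here) and the sample median. Both are unbiased, but simulation shows the median has larger variance (asymptotically by a factor of about pi/2). Parameters are illustrative:

```python
import random
import statistics

# Both the sample mean and the sample median are unbiased for a Gaussian
# mean, but only the mean has minimum variance.
random.seed(3)
theta, sigma, n, trials = 0.0, 1.0, 51, 3000

means, medians = [], []
for _ in range(trials):
    sample = [random.gauss(theta, sigma) for _ in range(n)]
    means.append(statistics.mean(sample))
    medians.append(statistics.median(sample))

var_mean = statistics.pvariance(means)
var_median = statistics.pvariance(medians)
print(f"var(mean)  : {var_mean:.5f}")
print(f"var(median): {var_median:.5f}")
```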
Best linear unbiased estimator (BLUE) – a suboptimal scheme in which the estimator is constrained to be a linear function of the data and chosen to minimise variance among all such unbiased estimators.
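A classic instance of the BLUE is fusing readings of the same constant from sensors with different known noise variances: among linear unbiased combinations sum(w_i * y_i) with sum(w_i) = 1, variance is minimised by weights proportional to 1 / variance_i. The sensor numbers below are illustrative assumptions:

```python
# BLUE for a constant measured by independent sensors with unequal noise.
variances = [1.0, 4.0, 0.25]           # known noise variance of each sensor
readings  = [5.1, 6.3, 4.9]            # one noisy reading per sensor

inv = [1.0 / v for v in variances]
weights = [w / sum(inv) for w in inv]  # normalise so the estimator is unbiased

blue = sum(w * y for w, y in zip(weights, readings))
blue_var = 1.0 / sum(inv)              # variance achieved by the BLUE

print(f"weights : {[round(w, 3) for w in weights]}")
print(f"estimate: {blue:.3f}  (variance {blue_var:.3f})")
```

The most precise sensor (variance 0.25) dominates the weights, and the fused variance is smaller than that of any single sensor.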
Maximum likelihood estimator (MLE) – the estimator that maximises the likelihood of the observed data; it possesses asymptotic optimality properties, which makes it of great practical interest.
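For many models the MLE has a closed form. For the rate of an exponential distribution, setting the derivative of the log-likelihood to zero gives lambda_hat = 1 / (sample mean); the sketch below also hints at the asymptotic behaviour, with the estimate tightening as the sample grows. The true rate and sample sizes are illustrative:

```python
import random
import statistics

# MLE for the rate of an exponential distribution: lambda_hat = 1 / mean.
# Accuracy improves with sample size, illustrating asymptotic optimality.
random.seed(4)
true_rate = 2.0

for n in (10, 100, 10000):
    sample = [random.expovariate(true_rate) for _ in range(n)]
    mle = 1.0 / statistics.mean(sample)
    print(f"n = {n:5d}  lambda_hat = {mle:.3f}")
```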
Method of moments – a very simple approach with no optimality guarantees, but its estimates are usually satisfactory when a large sample is available.
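The method above amounts to equating sample moments to theoretical moments and solving for the parameters. For a Gamma(shape k, scale s) distribution, mean = k·s and variance = k·s², so k_hat = mean²/variance and s_hat = variance/mean. A sketch with illustrative parameters and a deliberately large sample:

```python
import random
import statistics

# Method of moments for a Gamma(k, s) distribution:
#   k_hat = mean^2 / variance,  s_hat = variance / mean.
random.seed(5)
k_true, s_true = 3.0, 2.0
n = 50000  # large sample, where moment estimators behave well

sample = [random.gammavariate(k_true, s_true) for _ in range(n)]
m = statistics.mean(sample)
v = statistics.variance(sample)

k_hat = m * m / v
s_hat = v / m
print(f"k_hat = {k_hat:.3f}, s_hat = {s_hat:.3f}")
```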