Definition:Histogram
Definition
A histogram is a form of bar chart used to represent a frequency distribution whose data are grouped into class intervals.
The $x$-axis is divided into segments whose lengths are proportional to the lengths of the corresponding class intervals.
The bars are then constructed so that their areas are proportional to the population numbers in the corresponding class.
Hence, if all the class intervals are equal, the heights of the bars are proportional to those population numbers.
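The area rule above can be sketched in code: with unequal class intervals, each bar's height is the class frequency divided by the interval width, so that area (height times width) is proportional to the population number. The class boundaries and frequencies below are illustrative, not from the source.

```python
# Hypothetical class interval boundaries and frequencies (illustrative data).
edges = [0, 5, 10, 20, 40]       # boundaries of four class intervals
frequencies = [10, 15, 20, 20]   # population number in each class

# Width of each class interval.
widths = [b - a for a, b in zip(edges, edges[1:])]

# Height = frequency / width, so that area = height * width = frequency.
heights = [f / w for f, w in zip(frequencies, widths)]

for (a, b), h in zip(zip(edges, edges[1:]), heights):
    print(f"[{a}, {b}): height {h}")
```

Note that the two widest classes have the same frequency ($20$) but different heights, which is exactly what the area rule requires.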
Also see
- Results about histograms can be found here.
Sources
- 1994: George E.P. Box, Gwilym M. Jenkins and Gregory C. Reinsel: Time Series Analysis: Forecasting and Control (3rd ed.) ... (previous) ... (next):
- Part $\text {I}$: Stochastic Models and their Forecasting:
- $2$: Autocorrelation Function and Spectrum of Stationary Processes:
- $2.1$ Autocorrelation Properties of Stationary Models:
- $2.1.2$ Stationary Stochastic Processes: Mean and variance of a stationary process
- 1998: David Nelson: The Penguin Dictionary of Mathematics (2nd ed.) ... (previous) ... (next): histogram
- 2008: David Nelson: The Penguin Dictionary of Mathematics (4th ed.) ... (previous) ... (next): histogram
- 2021: Richard Earl and James Nicholson: The Concise Oxford Dictionary of Mathematics (6th ed.) ... (previous) ... (next): histogram