An intelligence quotient, or IQ, is a score derived from one of several standardized tests designed to assess intelligence. The term "IQ," a translation of the German Intelligenz-Quotient, was coined by the German psychologist William Stern in 1912 as a proposed method of scoring children's intelligence tests, such as those developed by Alfred Binet and Théodore Simon in the early 20th century: the mental age measured by the test was divided by the child's chronological age and multiplied by 100. Although the term "IQ" is still in common use,
the scoring of modern IQ tests such as the Wechsler Adult Intelligence Scale is now based on a projection of the subject's measured rank onto a Gaussian bell curve with a center value (average IQ) of 100 and a standard deviation of 15 (standard deviations vary between tests; the Stanford-Binet IQ test, for example, uses a standard deviation of 16).
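As a concrete illustration of this deviation scoring, the sketch below maps a percentile rank onto the IQ scale by inverting the Gaussian cumulative distribution function with the parameters given above. It is a minimal sketch, not any test publisher's actual scoring procedure; the function name deviation_iq and its defaults are illustrative assumptions.

```python
from statistics import NormalDist

def deviation_iq(percentile: float, mean: float = 100.0, sd: float = 15.0) -> float:
    """Map a percentile rank (strictly between 0 and 1) onto the IQ scale.

    The deviation IQ is the point on a normal distribution with the given
    mean and standard deviation whose cumulative probability equals the
    subject's percentile rank.
    """
    return NormalDist(mu=mean, sigma=sd).inv_cdf(percentile)

# A subject at the 50th percentile scores the population average, 100.
print(round(deviation_iq(0.50)))           # 100
# The 98th percentile lies roughly two standard deviations above the mean.
print(round(deviation_iq(0.98)))           # 131
# The Stanford-Binet convention uses a standard deviation of 16 instead.
print(round(deviation_iq(0.98, sd=16.0)))  # 133
```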
IQ scores have been shown to correlate with factors such as morbidity and mortality, parental social status, and, to a substantial degree, parental IQ. While the heritability of IQ has been investigated for nearly a century, controversy remains over how heritable IQ is, and the mechanisms of its inheritance are still a matter of some debate.
IQ scores are used in many contexts: as predictors of educational achievement or special needs, as predictors of job performance and income, and by social scientists who study the distribution of IQ scores in populations and the relationships between IQ scores and other variables.
Average IQ scores in many populations have been rising at a rate of roughly three points per decade since the early 20th century, with most of the increase concentrated in the lower half of the IQ range: a phenomenon called the Flynn effect. It is disputed whether these changes in scores reflect real gains in intellectual abilities or merely methodological problems with past or present testing.
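Because tests are periodically re-normed so that the average score stays at 100, the Flynn effect implies that an unchanged raw performance earns a lower score against newer norms. The sketch below works through that arithmetic under two simplifying assumptions: that the drift is linear at three points per decade and that it is uniform across the range (the text above notes it is in fact concentrated in the lower half). The function name flynn_adjusted_score is illustrative, not an established formula.

```python
def flynn_adjusted_score(score: float, years_between_norms: float,
                         points_per_decade: float = 3.0) -> float:
    """Estimate the score a fixed raw performance would receive against
    norms set `years_between_norms` later, assuming a linear Flynn drift
    of `points_per_decade` applied uniformly across the whole range."""
    return score - points_per_decade * years_between_norms / 10.0

# A performance that scored 100 against 1950 norms would score about 85
# against 2000 norms, since the reference population gained ~15 points.
print(flynn_adjusted_score(100, years_between_norms=50))  # 85.0
```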