We have added a Word Frequency Analyser in the Members' Area. Simply copy and paste the text you want analysed into the box, click Calculate Now!, and the results will be displayed automatically. The first box will display the following information:
Total Word Count: 59
Total Unique Words: 45
Number of Sentences: 4
Average Words per Sentence: 14.8
Lexical Density: 76.27%
Fog Index: 5.90
The first four are all fairly obvious, but the last two might be less familiar.
Lexical Density is calculated by the following formula:
(Number of different words / Total number of words) x 100
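As a rough sketch, the calculation can be expressed in a few lines of Python. The tokenisation here (lowercasing and a simple regex) is an assumption, since the tool's exact rules aren't documented, so its figures may differ slightly:

```python
import re

def lexical_density(text):
    """(Number of different words / Total number of words) x 100.
    Tokenisation is a guess: lowercase words matched by a simple regex."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return len(set(words)) / len(words) * 100

print(round(lexical_density("the cat sat on the mat"), 2))  # ~83.33: 5 unique words out of 6
```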
What it tries to show is the concentration of vocabulary: the higher the figure, the more varied the vocabulary used. However, short texts tend to have high scores, which decline as texts grow in length. This is because words get repeated, and things like pronouns start replacing nouns. The text I entered to get the result above was a single paragraph. Here are the results with a longer text, an undergraduate academic essay:
Total Word Count: 3058
Total Unique Words: 1090
Number of Sentences: 129
Average Words per Sentence: 23.7
Lexical Density: 35.64%
Fog Index: 9.48
The word 'of' was used 127 times, for instance, which illustrates why lexical density is generally lower in longer texts.
The Fog Index is calculated as follows:
(Average No. of words per sentence + Percentage of words of three or more syllables) x 0.4
The result is supposed to correspond to the number of years of formal education needed to understand the text.
Technical documentation generally scores around 10-15, which is close to the second score above.
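The Fog Index calculation can be sketched as follows. This is the standard Gunning Fog formula rather than a reconstruction of the site's own code, and the vowel-group syllable counter is a crude heuristic of my own, so the numbers it produces won't exactly match the tool's:

```python
import re

def syllable_count(word):
    # Crude heuristic: count groups of consecutive vowels.
    # Real syllable counting needs a dictionary or better rules.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fog_index(text):
    """Gunning Fog: 0.4 x (avg words per sentence + % of 3+-syllable words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text)
    if not sentences or not words:
        return 0.0
    complex_words = [w for w in words if syllable_count(w) >= 3]
    avg_sentence_length = len(words) / len(sentences)
    pct_complex = 100 * len(complex_words) / len(words)
    return 0.4 * (avg_sentence_length + pct_complex)

print(round(fog_index("The cat sat. The dog ran."), 2))  # 1.2: short sentences, no complex words
```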
Neither test should be used as much more than a rough guide, but they do show some interesting results. Some linguists state that tests like Lexical Density are only statistically valid when random sections of identical length are compared. However, I find that comparing texts of different lengths can also be an interesting way of looking at language.
Categories: UsingEnglish Content