Applications

Since finishing my Master’s Thesis I have been applying to a lot of jobs, with positions ranging from substitute teacher to senior adviser. Between the 25th of May, 2015, and the 14th of November, 2016, the count is somewhere around 200 in total. What follows is an analysis of a subset of these, 108 to be exact, and what might be learned from a retrospective look at them.

How I write applications #

Like anyone, I try to summarize my qualifications and motivations for the specific position in a succinct and relatively short manner. Thus, no application is longer than a page, and each always includes my contact details, the date and place, a title, the contact details of the employer, and two to three paragraphs of text.

Along with this, of course, I include my CV and relevant documents such as my Master’s Diploma and letters of attestation. More importantly, the language of these applications is definitely written in my idiolect, at more or less the same level at which I would write an e-mail to a professor or a paper. As I will discuss, this may be unfortunate given the recipients of these applications.

Descriptive Data #

To start, let’s consider the development of the length of the applications. I measure this by the number of words, sentences, and lines in each application – which is of little interest by itself, but it clearly shows a trend of increasing length. In the following chart the applications are grouped by day, with a mean taken of each measure.

Figure 1. Descriptive Statistics #

Descriptive Statistics
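As an illustration of how these measures could be computed, here is a minimal sketch in pandas. It assumes a hypothetical layout of one plain-text file per application with the date in the filename – the actual code lives in the notebooks in Appendix 2:

```python
import glob
import re

import pandas as pd

rows = []
# Hypothetical layout: one plain-text file per application,
# named applications/YYYY-MM-DD-<employer>.txt
for path in glob.glob("applications/*.txt"):
    with open(path, encoding="utf-8") as f:
        text = f.read()
    filename = path.split("/")[-1]
    rows.append({
        "date": pd.to_datetime(filename[:10]),          # date from the filename
        "words": len(text.split()),
        "sentences": len(re.findall(r"[.!?]+", text)),  # crude sentence count
        "lines": text.count("\n") + 1,
    })

df = pd.DataFrame(rows)
# Group by day and take the mean of each measure, as plotted above.
daily_means = df.groupby("date")[["words", "sentences", "lines"]].mean()
print(daily_means)
```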

Notice that a good portion of the applications are of “full” length, that is, 200+ words, which tends to signify three paragraphs – filling an A4 page. This is often due to the inclusion of details related specifically to the job posting, such as “Experience with Microsoft Office” or “Statistical Analysis” being listed under desired qualifications. The shorter applications tend to only include the “basic” information such as my education and experience, which is mentioned in virtually every application.

Analytical Data #

To get a clearer perspective on how I write, I ran a series of statistical tests which provide an indicator of the “readability” and complexity of each application. I discuss the various statistics below, but these are the individual results:

Figure 2. Text Statistics #

Text Statistics

The statistics are grouped in histograms, with a distribution plot overlaid to give a simple overview of the linguistic level of most of the applications. Here’s an overview of what is measured:

  • Flesch Reading Ease: Indicates the difficulty of reading the text, where 0 is Very Difficult and 100 Very Easy [1].
  • Flesch-Kincaid Grade Level: Approximates the grade level needed to read the text, based on the US education system. Thus, grades 6-8 would be middle school, whilst grades 9-12 would be high school [2].
  • Gunning FOG Index: Measures the amount of “fog” in a piece of text, that is, unnecessary complexity [3]. It can best be described by the following:

The ideal score for readability with the Fog index is 7 or 8. Anything above 12 is too hard for most people to read. For instance, The Bible, Shakespeare and Mark Twain have Fog Indexes of around 6. The leading magazines, like Time, Newsweek, and the Wall Street Journal average around 11 [3:1].

  • SMOG Index: Alternatively known as the Simple Measure of Gobbledygook, which also indicates a grade level [4].
  • Automated Readability Index: Derived from word and sentence difficulty – that is, the number of letters per word and the number of words per sentence, respectively – rather than from syllable counts. Also yields a grade level [5].
  • Coleman-Liau Index: Like the Automated Readability Index, it relies on characters rather than syllables, and it also yields a grade level [6].
  • Linsear Write Formula: Developed by the U.S. Air Force to determine grade level based on sentence length and the number of words with three or more syllables [7].
  • Dale-Chall Readability Score: Counts “hard” words – that is, words not commonly familiar to 4th-grade students – and sentence length. Indicates a required grade level [8].

Notably, most of these measures target a necessary grade level – hence years of education – needed to understand the text. If grouped together, these statistics yield a single indicative measure of the grade level needed to understand each application, and thus the development of this readability can be shown.

Figure 3. Readability Consensus #

Readability Consensus
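For reference, the individual formulas and a consensus like the one above can be computed with, for instance, the textstat package – a sketch with a hypothetical file name, and not necessarily the tooling used here:

```python
import textstat  # pip install textstat

# Hypothetical example file; any application text would do.
with open("applications/2016-11-14-example.txt", encoding="utf-8") as f:
    text = f.read()

scores = {
    "Flesch Reading Ease": textstat.flesch_reading_ease(text),
    "Flesch-Kincaid Grade Level": textstat.flesch_kincaid_grade(text),
    "Gunning FOG Index": textstat.gunning_fog(text),
    "SMOG Index": textstat.smog_index(text),
    "Automated Readability Index": textstat.automated_readability_index(text),
    "Coleman-Liau Index": textstat.coleman_liau_index(text),
    "Linsear Write Formula": textstat.linsear_write_formula(text),
    "Dale-Chall Readability Score": textstat.dale_chall_readability_score(text),
}
for name, score in scores.items():
    print(f"{name}: {score:.1f}")

# Combine the formulas into a single estimated grade level,
# analogous to the readability consensus plotted above.
print(textstat.text_standard(text))
```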

The most striking thing about this is that most of the applications are written in language seemingly only understandable by college students. More worryingly, the majority of them would be hard to understand for someone not enrolled in college. Some are even approximated to be linguistically difficult for readers without a graduate degree. This ostensible lack of readability reveals a potential reason why the applications were unsuccessful. However, the trend has declined since the first time I ran this analysis: Many applications are now more succinct and easier to read.

Qualitative Data #

If we turn to look at specific applications, we can draw a couple of samples that exemplify these results. Consider the application for the position as Adviser at the Norwegian Barents Secretariat in Murmansk:

With my administrative experience in the preparation of the structure and content of seminars in various courses, and critiquing academic papers on various levels, I am confident that I can proficiently execute the work required as an adviser to the Murmansk Office.

This is the longest sentence in the application, and though it only contains two commas, it packs a lot of information into its 42 words and 266 characters. Clearly, some words could be dropped for more clarity and concision. Further, the application is approximated at the 17th to 18th grade level, and it was admittedly written with a highly educated audience in mind. However, as discussed below, this may be ineffective.
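These counts are easy to verify:

```python
sentence = (
    "With my administrative experience in the preparation of the structure "
    "and content of seminars in various courses, and critiquing academic "
    "papers on various levels, I am confident that I can proficiently "
    "execute the work required as an adviser to the Murmansk Office."
)
print(len(sentence.split()))  # 42 words
print(len(sentence))          # 266 characters
```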

Another example is the application to the position as a Research Assistant at RAND:

With my experience in the preparation of the structure and content of seminars in various courses, as well as my time as the representative to the educational committee, I am confident that I can constructively and proficiently contribute to the work that RAND Education performs.

Roughly the same length as the previous example, it is nonetheless constructed in a more readable manner. Written for the same audience, it benefits from a sparser presentation rather than a futile compression of information into each sentence. This is indicated by the required comprehension being rated at the 14th to 15th grade level.

A more recent example from an application to the position as a Frontend Developer at Ramsalt Labs:

In previous roles as a Seminar Instructor, Research Assistant, and Master’s Representative, cooperation with colleagues were imperative - and I believe my professional attitude and interest in this line of work can contribute positively to the work done at the Ramsalt Lab.

Though similar in content and with the same objective as the previous two examples, it conveys a more fluent and constructive message directed at cooperation and interest.

Implications #

Obviously, the discussed metrics do not mean that further applications should be “dumbed down” or greatly simplified for the purpose of being more accessible. After all, these are not biblical texts. However, the aforementioned trends provide a guide for what to avoid: Overly complex sentence structures and information-packed paragraphs that clearly will not be read by anyone after a first glance.

This is not an unwarranted worry stemming from the pessimistic view that hiring managers, recruiters, or other staff only glance at a resume or cover letter; it is substantiated by research. Recruiters spend 6 to 11 seconds reading a resume [9] [10] [11], which by any standard is a short amount of time to comprehend information spanning 1-3 pages. More prominent is a study by the Society for Human Resource Management which found that:

It takes most HR professionals less than five minutes to determine whether a job candidate will proceed to the next step of the selection process [12].

In the case of cover letters, or applications as I refer to them:

83% of survey respondents reported that they spent one minute – or less – in reading an applicant’s cover letter. 15% stated they would spend more than a minute. […] The time grows shorter as the firm grows larger. 52% of companies with staffs of 250 or more will spend, at most, 30 seconds reviewing a cover letter [13].

This is not surprising given that the reading speed of an average adult is 300 words per minute [14]. Consider the distribution in the bottom plot of the following chart:

Figure 4. Reading Times #

Time to read

As seen, the lengths of the applications are grouped into three categories: Long, medium, and short. Importantly, considering the expected time to read the applications – shown in the top plot of the chart – we can see that within these categories the applications conform to the aforementioned finding: All of them are readable in less than a minute by the average adult.

Further, this plot shows the expected reading times for the mean length of these categories: 150 words per minute for 3rd-grade students, 250 for 8th-grade students, 300 for an average adult, 350 for 11th-grade students, 450 for an average college student, 575 for high-level executives, 675 for college professors, and 800 for high-scoring college students [14:1] [15].
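Since the expected reading time is simply the word count divided by the reading speed, these estimates are straightforward to reproduce. A small sketch using the speeds cited above, with an illustrative 200-word application:

```python
# Words-per-minute figures for different readers, as cited above [14][15].
SPEEDS_WPM = {
    "3rd grade student": 150,
    "8th grade student": 250,
    "average adult": 300,
    "11th grade student": 350,
    "average college student": 450,
    "high-level executive": 575,
    "college professor": 675,
    "high-scoring college student": 800,
}

def reading_time_seconds(word_count: int, wpm: int) -> float:
    """Expected time in seconds to read word_count words at wpm words per minute."""
    return word_count / wpm * 60

# A "full" application of roughly 200 words (see Figure 1).
for reader, wpm in SPEEDS_WPM.items():
    print(f"{reader}: {reading_time_seconds(200, wpm):.0f} s")
```

At 300 words per minute, the average adult needs about 40 seconds for a full-length application, consistent with the finding above.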

Hopefully, hiring managers and recruiters are educated to at least college level and capable of reading at least as fast as the average adult. If this assumption is correct, they are clearly capable of comprehending all of the applications – even with their linguistic complexity – in less than a minute. For reference, my Master’s Thesis would take 40 to 182 minutes to read across the aforementioned categories, and my CV 19 to 103 seconds. All of this, of course, assumes some mixture of skimming through the text and returning to prominent points of substance.

Conclusion #

Though this analysis is limited in scope and relatively short, some prominent trends have surfaced: The applications are on occasion unnecessarily complex in linguistic construction – especially in terms of phrasing and sentence structure – and hence the language of many of them can still be simplified to be understood by whoever reads them. Additionally, most of the applications are relatively long, despite the estimated reading time, and so sparsity should be favored.

Further, there are obviously steps that should be taken toward communicating skills and interest relevant to the position, rather than merely personal qualifications. In the same vein, the majority of the jobs I have applied to implicitly ask for more experience than I have at this point; some explicitly do, with the proviso that qualifications can outweigh experience or vice versa. In any regard, this analysis has shown some structural issues which should be addressed in future applications.


Appendix 1: Table with overview of Applications
Appendix 2: Jupyter/IPython Notebooks

Sources #


  1. The Flesch Reading Ease Formula

  2. The Flesch-Kincaid Grade Level Formula

  3. Gunning’s Fog Index (or FOG) Readability Formula

  4. The SMOG Readability Formula

  5. The Automated Readability Index

  6. The Coleman-Liau Index

  7. The Linsear Write Readability Formula

  8. The New Dale-Chall Readability Formula

  9. Business Insider 2012: “What Recruiters Look At During The 6 Seconds They Spend On Your Resume”

  10. Forbes 2012: “What Your Resume Is Up Against”

  11. Thinkopolis 2014: “‘Time’ to Work”

  12. SHRM Survey Findings: Résumés, Cover Letters and Interviews (2014)

  13. ResumeEdge 2013: “Hiring Managers Speak Out On Cover Letters”

  14. Free Speed Reading 2009: “What is the Average Reading Speed of Americans?”

  15. Forbes 2012: “Do You Read Fast Enough To Be Successful?”