From the start, WeScreenplay has been devoted to approaching script coverage and screenplay contests like a Silicon Valley startup. Besides automating away all of the tasks that other companies have to do manually, we've been building models out of the troves of script data that pass through our system every day.
Today, WeScreenplay is proud to finally unveil some of that data to its customers to lend insight into the strengths and weaknesses of their scripts. You'll notice a new "Percentile" column in your dashboard, representing where your script stands among all other scripts scored by that same analyst. You'll also be able to see a percentile breakdown for each category, like plot, dialogue, and structure, on the last page of every coverage you purchase through us. Until now, it was a guessing game what those scores meant, and impossible to compare scores from reader to reader, since some readers naturally tend to be tougher on scripts than others. We hope this gives you a better understanding of those scores and helps you track your progress along the way.
Scores are normalized across readers by calculating each reader's average score and standard deviation for a given scoring category, then shifting their raw scores onto our internal normalized scale. We then take the z-score of the normalized score and convert it to a percentile. In a nutshell, this means that if a reader tends to average a 4 for, say, dialogue, and they give your script a 7 for dialogue, then your normalized dialogue score will likely receive a one- or two-point boost.
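For the curious, here's a minimal sketch of the idea in Python. This is an illustration, not our production code: the function name, the example mean and standard deviation, and the assumption that scores are roughly normally distributed are all ours for demonstration purposes.

```python
from statistics import NormalDist

def score_percentile(raw_score, reader_mean, reader_std):
    """Convert a reader's raw score to a percentile.

    The raw score is standardized against that reader's own average
    and standard deviation (a z-score), then mapped to a percentile
    using the normal distribution's CDF.
    """
    z = (raw_score - reader_mean) / reader_std
    return NormalDist().cdf(z) * 100

# Hypothetical example: a reader who averages 4.0 on dialogue
# (with a standard deviation of 1.5) gives your script a 7.
# That z-score of 2.0 lands near the top of that reader's range.
print(score_percentile(7.0, reader_mean=4.0, reader_std=1.5))
```

The key point is that the percentile is relative to each individual reader's scoring habits, so a 7 from a tough grader can mean more than a 7 from a generous one.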
Because normalizing scores allows us to compare scores from one reader to another, it greatly simplifies the process of choosing which scripts to advance to further rounds of our contests and makes them a whole lot fairer than other contests. While we've been doing this for a while, we hope the increased transparency gives contest entrants more confidence that their submission is getting a fair shake.
By the way, just for fun, we took a look at the data to see which genres stood out in terms of their scores. Interestingly, "Adventure" scripts received the highest overall average percentile (61st), while scripts labeled "Other" (presumably meaning the writer was unable to place them in a genre) averaged the lowest, at the 42nd percentile. Comedy scripts were second lowest, at just the 45th percentile. By far the most common genre submitted has been Drama, followed by Comedy, with Animated screenplays coming in last.
The strongest scoring category has been "concept," while the lowest has been "plot," which makes intuitive sense: many of the scripts coming through aren't fully polished, so while the concept might be there, the plot structure sometimes needs a little work.
As always, if you have any questions, comments, or suggestions, don't hesitate to reach out to firstname.lastname@example.org. We love feedback! Happy writing…