AI content generation is gaining traction among digital marketers, but some people remain skeptical of the content machines produce.
This is especially relevant when it comes to rewriting articles because high-quality rewrites are an extremely efficient way to amplify your content strategy, but bad rewrites are useless, even harmful.
The main concern is that machine-generated rewrites are noticeably lower quality than those created by humans, which would make a bad impression on readers.
But with the advancements in deep learning WordAi has been making, we don’t think readers can tell how the content was created.
So to validate this assumption, we ran a series of tests with third-party readers to understand if people could really tell that WordAi content was written by a machine.
Our tests set out to answer 3 important questions:
For each test, we hired 300 people to read different paragraphs and answer questions about the content they read. Throughout the tests, we used 3 different types of paragraphs:
- Original paragraphs were written by freelance writers who were paid an average of $2.20 per 100-word paragraph.
- WordAi-rewritten paragraphs were generated from the original paragraphs using WordAi Version 5 default settings. WordAi charges a flat fee for up to 3,000,000 rewritten words per month.
- Human-rewritten paragraphs were rewritten from the original paragraphs by freelance writers who were paid an average of $2.20 per rewritten paragraph.
When measuring turnaround time, WordAi took less than a second to return each paragraph rewrite while the freelancers took at least 24 hours to return the original paragraphs and rewritten paragraphs.
1: Can readers tell that WordAi content was written by a machine?
Before we could add in the element of rewriting, we needed to know whether readers could tell that WordAi’s content was written by a machine in general.
So, our first test gave readers a single paragraph at a time and asked directly if the paragraph was written by a human or a machine.
To prevent bias in this test, we included a control group of original paragraphs that were written by freelancers in addition to the paragraphs rewritten by WordAi.
We found that when looking at both the control paragraphs and the paragraphs rewritten by WordAi, readers could not reliably tell whether they were written by a machine or a human.
It is important to note that because we directly asked readers if they thought the content was written by a machine, they were more likely to think that any paragraph was written by a machine (as seen by the high percentage of “Written by a Machine” answers in the control group of paragraphs written by human freelancers).
However, in daily life, if a reader came across a piece of content paraphrased by WordAi, they likely wouldn’t even question whether or not it was written by a human.
2: Can readers tell that WordAi content was rewritten by a machine when compared to paragraphs rewritten by actual human writers?
Once we determined that readers could not tell that WordAi content was machine-written in general, we added in the concept of rewriting. This gave the readers more context when evaluating the content, so it should have been easier for them to pick out the machine-written content if it was lower quality.
To test this, we gave the readers an original paragraph and two paraphrased versions (one rewritten by WordAi and one rewritten by a freelancer). We then asked the readers which version was created by a machine and which was created by a human.
In total, 65% of people could not tell which paragraph was rewritten by a machine.
Realistically, this means that people are no more likely than chance to guess which paraphrased paragraph was created by WordAi and which was created by a human.
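To make the “no more likely than chance” claim concrete, here is a quick sanity-check sketch of our own (not part of the original study): with two answer choices, random guessing identifies the machine-written paragraph 50% of the time, and for 300 readers a normal-approximation 95% interval shows how far an observed rate can stray from 50% while remaining indistinguishable from guessing.

```python
# Illustration only, not from the study: compute the band of
# observed rates that is statistically indistinguishable from
# random guessing (50%) with n = 300 readers, using a
# normal-approximation 95% confidence interval.
import math

n = 300          # readers per test (from the article)
p_chance = 0.5   # guess rate with two answer choices
z = 1.96         # z-score for 95% confidence

margin = z * math.sqrt(p_chance * (1 - p_chance) / n)
low, high = p_chance - margin, p_chance + margin
print(f"chance band for n={n}: {low:.1%} .. {high:.1%}")
# Any observed rate inside roughly 44%–56% is consistent with guessing.
```

In other words, with a sample of 300, an identification rate has to land well outside the mid-40s-to-mid-50s range before it can be distinguished from a coin flip.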
3: When comparing a piece of content to WordAi’s rewrite, can readers tell which is the original?
While it is interesting to know that readers cannot accurately tell WordAi’s rewritten content apart from content rewritten by humans, the real goal is for no one to know that the content was rewritten at all. So, our final question was designed to test this directly.
In our final test, readers were shown an original paragraph and WordAi’s paraphrased version and were asked to pick which paragraph was the rewrite.
More than half of all readers thought the original paragraph was the rewrite or admitted to not knowing. In fact, the readers were almost evenly divided between which paragraph they thought was the rewrite. Therefore, the readers could not accurately detect which paragraph was the original and which was paraphrased.
When using text rewriting as a content strategy, it is very important for the paraphrased text to maintain the same level of quality and usefulness as the original content. Arguably, the best way to determine if a rewrite is as useful as the original is if a reader cannot tell which piece of content came from the other.
1: Readers could not tell that WordAi’s content was written by a machine.
2: Readers could not tell that WordAi’s paraphrases were written by a machine when compared to paraphrases written by actual human writers.
3: When given an original paragraph and WordAi’s rewrite, readers could not tell which was which.
4: Given that readers could not tell the rewrites from the original paragraphs, WordAi provides the same level of quality as a human writer.
5: WordAi’s rewrites cost significantly less than freelancers’: paragraphs purchased from freelancers cost $2.20 per 100 words, compared to WordAi’s flat rate for up to 3,000,000 words per month.
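The cost gap in point 5 can be sketched with back-of-the-envelope arithmetic. The freelancer rate comes from the article; the flat monthly fee below is a placeholder, since the article states only that WordAi charges a flat rate for up to 3,000,000 rewritten words per month, not what that fee is.

```python
# Back-of-the-envelope cost comparison at WordAi's monthly word cap.
# The freelancer rate ($2.20 per 100 words) comes from the article;
# flat_monthly_fee is a PLACEHOLDER value, not WordAi's real price.
freelancer_rate = 2.20 / 100        # dollars per word
monthly_words = 3_000_000           # WordAi's stated monthly cap

freelancer_cost = freelancer_rate * monthly_words
print(f"freelancers at full volume: ${freelancer_cost:,.0f}/month")

flat_monthly_fee = 57.0             # placeholder fee, for illustration
effective_rate = flat_monthly_fee / monthly_words
print(f"effective per-word rate at the cap: ${effective_rate:.6f}")
```

At full volume, human rewrites would cost tens of thousands of dollars per month, so any plausible flat fee drives the effective per-word rate far below the freelancer rate.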
The end goal is for readers to never think twice about how the content they’re reading was created because what matters is that the content is engaging and useful to the reader.
When WordAi rewrites content, it maintains the overall meaning and value of the original text while changing everything else. So you can rest easy knowing that WordAi’s unique rewrites will be just as high quality and valuable as the original.
The post Can WordAi’s AI rewriter tool pass human evaluation? appeared first on WordAi Blog.