While preparing for the IELTS exam, I encountered several challenges with the writing portion of the test. The main issues were the lack of fast feedback and the need to juggle many tools to check my essays. I wanted to solve these problems by building a web application that would provide faster, cheaper feedback and help people prepare for the exam with fewer tools.
Checking essays with a teacher is costly and time-consuming. Exam preparation is expensive, and having a teacher check your essays is one of the biggest expenses.
Preparing for the writing part of the exam is a tedious process: finding questions, writing essays, checking grammar and spelling, tracking time and word count, and consulting a thesaurus.
My research into the problem included conducting interviews, collecting feedback from teachers, analyzing essays, and reviewing online resources to identify helpful tools and strategies for effective essay writing.
Based on this research, I decided that I would focus on solving the following problems first:
- Ensuring that the grammar checker is tailored to the exam;
- Replacing self-checklists with programmatic checks;
- Providing faster feedback on essays (including approximate score);
- Providing some guidance for people who are not familiar with the exam.
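To give a sense of what replacing a self-checklist with a programmatic check could look like, here is a minimal sketch of one such check in TypeScript. The 250-word minimum for Task 2 is the real IELTS requirement; the type names and message wording are my own illustration, not taken from the actual app.

```typescript
// One self-checklist item ("did I write enough?") turned into code.
type CheckResult = {
  passed: boolean;
  message: string;
};

// Count words by splitting on runs of whitespace.
function wordCount(essay: string): number {
  return essay.trim().split(/\s+/).filter((w) => w.length > 0).length;
}

// IELTS Writing Task 2 requires at least 250 words.
function minimumLengthCheck(essay: string, minWords = 250): CheckResult {
  const count = wordCount(essay);
  return count >= minWords
    ? { passed: true, message: `Word count: ${count} (minimum of ${minWords} met)` }
    : { passed: false, message: `Only ${count} words; Task 2 requires at least ${minWords}` };
}
```

Each check in the sidebar could be a function of this shape, so running the whole checklist is just mapping over a list of checks.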
After that, I wrote down requirements and constraints.
Then I moved on to exploring the overall design and core components: navigation, cards, layout, colors, and interactions.
I researched similar apps and sources of inspiration: online and offline text editors, grammar checkers, online essay checkers, designs on Dribbble, book readers, and apps that allow leaving comments anywhere.
Thinking about requirements and constraints in advance significantly reduced the number of design decisions I needed to make during the design process.
For instance, decisions such as what color to use for the sidebar were easy to make, because it followed from the constraints that the sidebar should be light gray to improve readability on low-resolution monitors.
As I worked on the design, I explored a variety of options for different elements of the interface. This was an ongoing process, as I constantly evaluated and refined my choices based on their effectiveness and overall aesthetic. Through this process, I arrived at a final design that met my requirements.
For a moment, let's consider the main screen of the app — the editor. It consists of a text area for the essay and a sidebar. Let's take a look at the sidebar.
In the sidebar, there is a list of check cards, which were initially shown on a white background. However, after testing this design on a low-resolution monitor, I found that the cards were difficult to see on a white background, so I changed the background color to light gray ➀.
In the collapsed form, cards show only a header and one line of description ➁. This leaves more space for other cards while keeping each card's meaning clear. Each card is color-coded to reflect the exam scoring system ➂. Clicking a card expands it to show the check results, with an optional link to a more detailed explanation of the check ➃.
One aspect of the UX design that I particularly liked was the way underlined text interacted with the corresponding check card. Clicking on underlined text scrolls the sidebar to the related card, and clicking on a check card scrolls the text to the related underlined passage. This feature helps users easily connect checks with the corresponding text.
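The two-way link between underlines and cards can be kept simple if both elements derive their DOM ids from a shared check id. This is a hypothetical sketch of that scheme; the id prefixes and function names are my own, not taken from the actual app.

```typescript
// Each check has a stable id; the underline and the card derive
// their element ids from it, so either side can find the other.
function cardIdFor(checkId: string): string {
  return `card-${checkId}`;
}

function underlineIdFor(checkId: string): string {
  return `underline-${checkId}`;
}

// In the browser, a click handler on an underline would then call:
//   document.getElementById(cardIdFor(checkId))
//     ?.scrollIntoView({ behavior: "smooth", block: "nearest" });
// and a click handler on a card would do the same with underlineIdFor.
```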
Another key feature of the app is the ability to apply suggestions without manually copying them. For instance, when a grammar check produces a suggestion, users can simply click a button to apply it, making the correction process more efficient. This feature enhances the user experience by making it easier and faster to fix mistakes.
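Applying a suggestion boils down to a single string splice if each suggestion carries the character range it targets. The shape below is a minimal sketch under that assumption; the real app's suggestion format may differ.

```typescript
// A suggestion from a check: replace text[start..end) with `replacement`.
type Suggestion = {
  start: number;       // index of the first character to replace
  end: number;         // index one past the last character to replace
  replacement: string;
};

// Apply a suggestion by splicing the replacement into the essay text.
function applySuggestion(text: string, s: Suggestion): string {
  return text.slice(0, s.start) + s.replacement + text.slice(s.end);
}

// Example: fixing a subject-verb agreement error in one click.
const fixed = applySuggestion("I has a pen", {
  start: 2,
  end: 5,
  replacement: "have",
}); // → "I have a pen"
```

If multiple suggestions are applied in sequence, they should be applied from the end of the text toward the beginning so earlier splices don't shift the offsets of later ones.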
Besides the editor, I also designed the other app screens, including the sign-in, sign-up, password-reset, and dashboard screens, the landing page, and all the transactional emails the app sends.
I conducted a few more interviews to get feedback on the app. People said that it replaced multiple other tools for them and was clear and easy to use. The main concern was the reliability of the calculated score: it felt too "mechanical". Unfortunately, that was the feature I thought was worth charging for, because it saved people time and money. I removed it until I could come up with a more effective solution.
Then I thought I might leverage an LLM for essay scoring. To validate this idea, I built a Telegram bot that provided text feedback on essays. The scoring was still not accurate enough, and the API was quite expensive (12 cents per call for a fine-tuned model). Additionally, the training data was very limited. It is also possible that the exam organizers will implement their own AI-based scoring system in the future, rendering my app obsolete.
Considering all these factors, I decided to stop working on this app.