Setting up correct answers and grading them in a descriptive test form, both with a GAS script.

  1. Introduction
    In a previous article on this blog, I introduced a spreadsheet that generates an auto-scoring descriptive test for English vocabulary learning. However, when I actually had the children use it, the multiple-choice test could be prepared smoothly, answers included, whereas the short-descriptive test, which requires typing the answer on a keyboard, was still only semi-automatic at that point: I had to copy and paste the correct answers into the form by hand (as of October 15, 2023).

    If it is a test with only a handful of questions, say about five, you can prepare the answers as easily as humming a tune, but once the number of questions grows to 20 or 30, preparing them becomes a chore. Although the GAS created in the spreadsheet cannot embed the answers to the descriptive questions in the form, I learned, with ChatGPT’s help, that by writing a script in the form itself you can grade the test automatically and return the results to the test taker by email.

    Therefore, in this article I attempt to grade short-descriptive tests automatically by using the form’s own script to (i) prepare the answers in the form and (ii) notify the test taker of the grading results by email. Eventually, when the GAS code is extended so that the short-descriptive answers can be prepared from the spreadsheet with GAS, I expect this approach will be replaced. Until then, I hope you will read this article as a “bridge”.

The form also comes equipped with a script editor, allowing you to write scripts using GAS.

  2. Writing GAS scripts on the form side…
    First of all, anyone who writes scripts or program code may wonder, “Is it really impossible with the current GAS?” or “Maybe I just haven’t researched it enough…”. In fact, there are paid add-ons that can embed correct answers in short-descriptive questions answered from the keyboard, so I searched the internet convinced that there must be a way somewhere.

    In the process of searching for information, I could not find any description saying “you can prepare answers directly with GAS”, but I did find a small post saying “this has been resolved with the expansion of the form’s API…”.

    So, relying on the fact that GAS scripts can be written not only in the spreadsheet but also in the form, I began experimenting to see whether the correct English words, which the spreadsheet-side GAS had sent as hints for each short-descriptive question, could be registered as the answers to those questions by a script written on the form side.
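
    As an illustration of what a script on the form side can see, here is a minimal sketch (not the code from this project) that simply lists the hint stored with each short-answer question. It assumes the spreadsheet-side GAS placed the correct word in each question’s description (help text).

    function listHints() {
      // Form-bound script: getActiveForm() returns the form this script is attached to.
      var form = FormApp.getActiveForm();
      // Look only at the short-answer (text) questions.
      form.getItems(FormApp.ItemType.TEXT).forEach(function (item) {
        // The hint sent from the spreadsheet is assumed to live in the help text.
        Logger.log(item.getTitle() + ' -> ' + item.getHelpText());
      });
    }
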
  3. Getting advice from ChatGPT
    However, even after researching GAS and the Google Forms API, I still could not find a definitive method. I barely managed to find the expression “CorrectAnswer” shown in JSON format in the Google Forms API, but I could do nothing with that information alone. In desperation, I decided to ask ChatGPT for help.
  • Me: I’m creating questions for a form from a spreadsheet using GAS code. I want to add the hint string content2 in the following code as the correct answer. Can you tell me how to do this? (…Code omitted)
  • ChatGPT: Google Forms does not provide a feature to set a correct answer for text items. Text items are free-form questions, and their answers vary by user, so they cannot be automatically judged as correct or incorrect.
    • However, if what you’re looking for is a feature that judges whether the user’s answer to a text item matches the correct answer (content2) after they answer, and scores based on that, it is possible to implement this as a post-processing step using a Google Apps Script.
    • Specifically, you use the onFormSubmit event, which is triggered when a form response is submitted. In this event, you compare the user’s answer with content2, and if they match, you score it. Here is an example of how to do this. (…Omitted below)
  • I was genuinely surprised to receive a response showing that what I wanted could be achieved in a way different from what I had asked. The suggested method, such as informing students of their results by email, differs slightly from the feedback methods we have used at school so far, but I could see the possibility of making the impossible possible. (A minimal sketch of this onFormSubmit idea follows this exchange.)
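
    To make the suggestion concrete, the following is a minimal sketch of the kind of onFormSubmit post-processing ChatGPT described, not its actual reply. CORRECT_ANSWERS is a hypothetical stand-in for the hint strings (content2), keyed by question title.

    // Hypothetical answer key standing in for the hint strings (content2).
    var CORRECT_ANSWERS = {
      'Question 1': 'family',
      'Question 2': 'library'
    };

    // Attach this function to an installable "on form submit" trigger.
    function onFormSubmitCheck(e) {
      var itemResponses = e.response.getItemResponses();
      var score = 0;
      itemResponses.forEach(function (ir) {
        var given = String(ir.getResponse()).trim().toLowerCase();
        var correct = (CORRECT_ANSWERS[ir.getItem().getTitle()] || '').toLowerCase();
        if (given === correct) score++;
      });
      Logger.log('Score: ' + score + ' / ' + itemResponses.length);
    }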

  4. Where should I hide the correct answer words that I sent as hints?

    In the GAS created in the earlier spreadsheet, short-descriptive questions answered from the keyboard were generated, and the correct answer was sent along as a hint for each question. Learners could copy this hint and paste it into the form’s answer field, which made automatic grading possible. Starting from this hint’s English word, I consulted ChatGPT several times and managed to compare the string the learner typed into the form with the string sent as the hint, to judge whether the answer was correct. However, the hint is also visible to the students, and it will not be much of a word test unless it is hidden somehow. While working with ChatGPT and searching for places to hide the correct answers, I hit on an idea: what if the word in the hint were recorded as the first answer to this form?

    A form test has a function that keeps a record of every answer, even when it is taken many times. Using this function, if you send the hint strings as the very first answer, the correct answer for each question is preserved in the form, and from then on correctness can be judged by comparing each new answer against that first one. I immediately asked ChatGPT for two scripts: “a script that answers each question with the hint displayed for it and submits the response” and “a script that deletes the hints after submission”. I then combined the two into a single script that uses each question’s hint as its answer and, after submitting, deletes the hint from every question.

    If you run this script once on a short-answer form where the correct words are still displayed as hints, they are recorded as the first answer to the form and removed from the hints. In this way, the correct words that used to sit in the hints are kept inside the form, invisible to the students.
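
    In outline, the combined script does something like the following (a minimal sketch, assuming the form contains only short-answer text items and that each hint, i.e. the help text, holds the correct word):

    function storeHintsAsAnswerKey() {
      var form = FormApp.getActiveForm();
      var textItems = form.getItems(FormApp.ItemType.TEXT).map(function (item) {
        return item.asTextItem();
      });

      // 1) Answer every question with its own hint and submit once:
      //    this becomes the form's first recorded response (the answer key).
      var keyResponse = form.createResponse();
      textItems.forEach(function (textItem) {
        keyResponse = keyResponse.withItemResponse(
            textItem.createResponse(textItem.getHelpText()));
      });
      keyResponse.submit();

      // 2) Delete the hints so the students can no longer see the answers.
      textItems.forEach(function (textItem) {
        textItem.setHelpText('');
      });
    }
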
  5. How to grade when repeating word practice many times…
    Next, I incorporated a new step that compares the first answer recorded in the form with the answer just entered and judges whether it is correct. A word test is usually used only once, when the teacher gives it as a test, but at my school the English teacher hands each student a spreadsheet that can generate a form test and tells them to “use it for your own study”. It is an initiative in which students create their own tests and study, unit by unit, from their own textbook.
    Used this way, students may repeat the same test many times, so a process for judging correctness that suits this kind of use was needed.
    So, I asked ChatGPT, “What would the script look like if I were to compare the first and last answers?”
  • ChatGPT: If you compare the first and last answers, the script would look like this. (Script omitted…)
  • This script grades by comparing the first and last answers and shows the results in the log when the GAS is run, along the lines of the sketch below.
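
    The script ChatGPT returned is omitted above; as a rough idea of the comparison, here is a minimal sketch, again assuming that the form’s first recorded response is the answer key written by the hint script.

    function gradeLatestAgainstFirst() {
      var form = FormApp.getActiveForm();
      var responses = form.getResponses();
      if (responses.length < 2) return;   // need the answer key plus at least one attempt

      // Build the answer key from the first recorded response.
      var keyByItemId = {};
      responses[0].getItemResponses().forEach(function (ir) {
        keyByItemId[ir.getItem().getId()] = String(ir.getResponse()).trim().toLowerCase();
      });

      // Grade the most recent response against that key and write the result to the log.
      var latest = responses[responses.length - 1].getItemResponses();
      var score = 0;
      latest.forEach(function (ir) {
        var ok = String(ir.getResponse()).trim().toLowerCase() ===
                 keyByItemId[ir.getItem().getId()];
        if (ok) score++;
        Logger.log(ir.getItem().getTitle() + ': ' + (ok ? 'correct' : 'incorrect'));
      });
      Logger.log('Score: ' + score + ' / ' + latest.length);
    }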

    After that, I asked ChatGPT several more times to improve the script, and finished with a script that grades each entry by comparing it with the first input (the correct answer). The entire script is posted in the attached document.
  6. Sending the test results by email
    With the improvements so far, the grading results were displayed in the execution log when the script was run. However, not all children are comfortable with ICT equipment, so to make the tool easier to use I improved it to send the grading results by email.
    When the answers are prepared in the form by copy and paste, there is no problem, because the form can show the results itself; this time, however, the grading appeared in a slightly different place, the log produced when the GAS runs, so I made the grading results readable by email instead. This greatly improved usability compared with running the script in the script editor.

    I also learned from ChatGPT how to write a script that reports the grading results by email, and how to set a trigger so that the email-sending script runs when the “Send” button is pressed, and so completed a script that automatically grades the English words and sends back the results.
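
    The finished script can be downloaded from the appendix; what follows is only a minimal sketch of the mailing part, assuming the form is set to collect respondents’ email addresses and that the first recorded response is the answer key.

    // Run this once from the script editor to install the trigger that fires
    // whenever a student presses the form's "Send" button.
    function installSubmitTrigger() {
      ScriptApp.newTrigger('mailGradingResult')
          .forForm(FormApp.getActiveForm())
          .onFormSubmit()
          .create();
    }

    function mailGradingResult(e) {
      var form = FormApp.getActiveForm();
      var responses = form.getResponses();
      if (responses.length < 2) return;   // the answer key itself is not graded

      // Answer key = the first recorded response.
      var keyByItemId = {};
      responses[0].getItemResponses().forEach(function (ir) {
        keyByItemId[ir.getItem().getId()] = String(ir.getResponse()).trim().toLowerCase();
      });

      // Grade the response that has just been submitted.
      var lines = [];
      var score = 0;
      var itemResponses = e.response.getItemResponses();
      itemResponses.forEach(function (ir) {
        var given = String(ir.getResponse()).trim();
        var ok = given.toLowerCase() === keyByItemId[ir.getItem().getId()];
        if (ok) score++;
        lines.push(ir.getItem().getTitle() + ': ' + given + (ok ? ' (correct)' : ' (incorrect)'));
      });
      lines.push('Total: ' + score + ' / ' + itemResponses.length);

      // Mail the result back to the student.
      MailApp.sendEmail(e.response.getRespondentEmail(),
                        'Word test result', lines.join('\n'));
    }
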
  7. Finally
    The idea of programmers asking ChatGPT for program code has been a topic on the web for some time, but actually using it in this project, I was amazed at its usefulness.
    I myself had not studied GAS scripting very deeply, yet when I said to ChatGPT, “I want this kind of processing!”, it assembled the code at once and responded to my requests for modifications again and again; it felt like consulting a very reliable friend who is good at GAS programming.

    For a short-descriptive test of 20 English words answered from the keyboard, preparation took about 4 minutes when the correct answers were set by cutting and pasting from the hints, and about 2 minutes when they were set by the script pasted into the form.

    From these numbers, and since cutting and pasting takes time roughly in proportion to the number of questions while the script takes a roughly fixed amount of time, cutting and pasting is faster up to about 10 questions; beyond that, setting the answers with the script created this time is more time-efficient.
    Cutting and pasting remains useful when you want to show only part of a hint, for example turning ‘family’ into ‘fa****’. On the other hand, if you repeatedly create tests of 20 or more questions, preparing the correct answers quickly with the script makes for efficient study. For the time being, I think it is fine to choose whichever way of preparing the answers suits the situation.

    Finally, a word about the cost of using ChatGPT. In this development I used ChatGPT version 4.0 via the API. When you use the API, the way tokens are counted for billing depends on the ChatGPT version and on whether you ask in Japanese or in English. This time I made about 50 inquiries in Japanese while creating the scripts, and the cost was $4.48.
    I’m truly amazed at the excellence of ChatGPT.

    Former Principal of Hakodate Kita Junior High School, Toshiyuki Okuzaki

Appendix: The GAS code created in this article can be downloaded. Please use it after confirming the disclaimer.

