
[Feature] Multiple Choice type questions #21

Open
wants to merge 7 commits into base: main

Conversation

Aditya062003
Contributor

Feature: Multiple Choice Question Generation

Description:

Resolves issue #20.
This PR adds support for generating multiple choice questions (MCQs) in the software. Distractors are produced by a pre-trained model trained on the RACE dataset, and the distractor pool is expanded via the ConceptNet API, which supplies words similar to the answer in the given context.

Implementation Details:

  • Integrated a pre-trained distractor-generation model trained on the RACE dataset.
  • Used the ConceptNet API to fetch words similar to the answer in the given context.
  • Combined the model-generated distractors with the ConceptNet results to assemble MCQs.
  • Attached a video demonstration of the feature.
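For illustration, similar words can be fetched from ConceptNet's public `/related` endpoint. This is a minimal sketch of that lookup; the helper names (`related_terms`, `fetch_related`) are hypothetical and not the PR's actual code:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

# ConceptNet's /related endpoint, filtered to English concepts.
CONCEPTNET_URL = "http://api.conceptnet.io/related/c/en/{term}?filter=/c/en"

def related_terms(payload, limit=5):
    """Extract English terms from a ConceptNet /related response payload."""
    terms = []
    for entry in payload.get("related", []):
        # Concept IDs look like "/c/en/big_cat"; keep the last path segment.
        terms.append(entry["@id"].rsplit("/", 1)[-1].replace("_", " "))
    return terms[:limit]

def fetch_related(answer, limit=5):
    """Query ConceptNet for terms related to the answer (makes a network call)."""
    with urlopen(CONCEPTNET_URL.format(term=quote(answer.lower()))) as resp:
        return related_terms(json.load(resp), limit)
```

The returned terms would then be merged with the model-generated distractors before option assembly.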

Changes Made:

  • Added functionality for MCQ generation.
  • Integrated the pre-trained distractor generation model.
  • Implemented API calls to ConceptNet for obtaining additional distractors based on context.
  • Users can now download the model checkpoint and the required Python script from here and place them in the modelC folder under the models folder. The models folder structure should look like this:
models
├── modelA
│   ├── config.json
│   ├── generation_config.json
│   └── pytorch_model.bin
├── modelB
│   ├── config.json
│   ├── generation_config.json
│   └── pytorch_model.bin
└── modelC
    ├── modelC.ckpt
    └── distractor_generator.py
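As a quick sanity check on this layout, a small hypothetical helper (not part of this PR) can report any files missing from the expected structure before the software tries to load the models:

```python
from pathlib import Path

# Expected files per subfolder, mirroring the tree above.
EXPECTED = {
    "modelA": ["config.json", "generation_config.json", "pytorch_model.bin"],
    "modelB": ["config.json", "generation_config.json", "pytorch_model.bin"],
    "modelC": ["modelC.ckpt", "distractor_generator.py"],
}

def missing_model_files(models_dir):
    """Return relative paths of expected files that are absent under models_dir."""
    root = Path(models_dir)
    missing = []
    for sub, files in EXPECTED.items():
        for name in files:
            if not (root / sub / name).is_file():
                missing.append(f"{sub}/{name}")
    return missing
```

Running this against the `models` folder after placing the downloads should return an empty list.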

Steps to Reproduce:

  1. Provide the context and the correct answer.
  2. Use the software's MCQ generation feature.
  3. Verify that the generated MCQ options are relevant to the context and answer.
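The final assembly step, combining the correct answer with model- and ConceptNet-sourced distractors into a single option list, can be sketched as follows; `build_mcq` is an illustrative name under assumed inputs, not the PR's actual API:

```python
import random

def build_mcq(question, answer, distractors, k=4, seed=None):
    """Assemble one MCQ: the answer plus up to k-1 unique distractors, shuffled."""
    # Drop duplicate distractors and anything equal to the answer (case-insensitive).
    seen = {answer.lower()}
    options = [answer]
    for d in distractors:
        if d.lower() not in seen:
            seen.add(d.lower())
            options.append(d)
        if len(options) == k:
            break
    rng = random.Random(seed)  # seed only to make the shuffle reproducible
    rng.shuffle(options)
    return {"question": question, "options": options, "answer": answer}
```

A duplicate like `"paris"` for the answer `"Paris"` is filtered out, so the option list never repeats the correct answer.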

Expected Behavior:

  • The software should accurately generate multiple choice questions based on the provided context and answer.
  • Distractors should be relevant and diverse, enhancing the quality of the MCQs.
  • MCQ options should be logically structured and formatted.

Testing:

  • Tested MCQ generation for accuracy and reliability.
  • Covered test cases for a variety of contexts, answers, and edge cases.

Video Demonstration:

[Link to the video demonstrating the working of the feature]
