cs182project2
Science Score: 44.0%
This score indicates how likely this project is to be science-related, based on the following indicators:
- ✓ CITATION.cff file: found
- ✓ codemeta.json file: found
- ✓ .zenodo.json file: found
- ○ DOI references
- ○ Academic publication links
- ○ Academic email domains
- ○ Institutional organization owner
- ○ JOSS paper metadata
- ○ Scientific vocabulary similarity: low similarity (8.4%) to scientific vocabulary
Repository
Basic Info
- Host: GitHub
- Owner: bdangs
- Language: Python
- Default Branch: main
- Size: 18.5 MB
Statistics
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Releases: 0
Metadata Files
README.md
CS182Project2
Team Members: Bryan Dang - bdang014, Christian Maristela - cmari038, Alan Xu-Zhang - axuz001
Option 2: AI-based Input Generation
Pre-trained large language models (LLMs) have recently emerged as a breakthrough technique,
and many researchers have been exploring how LLMs can be applied to software testing, for
example to generate structured data such as XML files.
For this option, you will use the pre-trained GPT2 model as an input generator to produce XML
files, then check whether the generated files are syntactically valid. Specifically, you will need to:
1. Prepare a valid XML file as the example input for GPT2.
a. Here are some examples for your reference: example 1, example 2.
b. Please generate another valid XML file according to XML syntax rules.
2. Write a program to use libxml2 to parse XML files and to check if they are syntactically
valid.
a. Here are some examples for your reference.
3. Design at least five prompt templates to query the GPT2 model.
a. For example, you can prompt GPT2 in this way: Here is an example XML file.
Please generate another one.
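Step 2 above can be sketched in Python. This is a minimal sketch, not the assignment's reference solution: it uses lxml, a common Python binding to libxml2, and the fallback to the standard library's `xml.etree` (which is not libxml2-based) is our own assumption for environments where lxml is not installed. The function name `is_valid_xml` is illustrative.

```python
def is_valid_xml(text: str) -> bool:
    """Return True if `text` is a syntactically valid XML document."""
    try:
        # lxml wraps libxml2, matching the assignment's requirement.
        from lxml import etree as _etree
        parse_error = _etree.XMLSyntaxError
    except ImportError:
        # Fallback (assumption): Python's stdlib parser, not libxml2-based.
        import xml.etree.ElementTree as _etree
        parse_error = _etree.ParseError
    try:
        _etree.fromstring(text)
        return True
    except parse_error:
        return False


# Example: a well-formed document passes, a mismatched tag fails.
print(is_valid_xml("<root><child/></root>"))   # valid
print(is_valid_xml("<root><child></root>"))    # mismatched closing tag
```

The same function can be pointed at each file GPT2 generates, so invalid outputs are filtered out automatically.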
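Step 3 asks for at least five prompt templates. One way to organize them is as format strings with an `{example}` slot for the valid XML file from step 1; the template wordings and the names `PROMPT_TEMPLATES` and `build_prompt` below are our own illustration, not part of the assignment.

```python
# Five illustrative prompt templates; {example} is replaced by a valid XML file.
PROMPT_TEMPLATES = [
    "Here is an example XML file.\n{example}\nPlease generate another one.",
    "The following is a valid XML document:\n{example}\nWrite a different valid XML document:",
    "Example XML:\n{example}\nNow produce a new, syntactically valid XML file:",
    "Given this XML file:\n{example}\nGenerate a similar but distinct XML file:",
    "XML input:\n{example}\nContinue with another well-formed XML document:",
]


def build_prompt(template: str, example_xml: str) -> str:
    """Fill one template with the example XML to produce a GPT2 query."""
    return template.format(example=example_xml)


prompt = build_prompt(PROMPT_TEMPLATES[0], "<note><to>Alice</to></note>")
print(prompt)
```

Each rendered prompt would then be fed to GPT2 (for instance via Hugging Face's text-generation pipeline) and the completion checked with the libxml2-based validator from step 2.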
Owner
- Name: Bryan Dang
- Login: bdangs
- Kind: user
- Location: Riverside
- Company: University of California, Riverside
- Website: bdangs.github.io
- Repositories: 1
- Profile: https://github.com/bdangs
Just a college student :P
Citation (CITATION.cff)
@misc{hf_canonical_model_maintainers_2022,
  author    = {{HF Canonical Model Maintainers}},
  title     = {gpt2 (Revision 909a290)},
  year      = 2022,
  url       = {https://huggingface.co/gpt2},
  doi       = {10.57967/hf/0039},
  publisher = {Hugging Face}
}