RoostGPT
Login at https://app.roost.ai to access the RoostGPT features.
Offerings:
- RoostGPT UI: triggers from local workspace: No; GitOps (GitHub, GitLab, Bitbucket): Yes; notifications (Slack/Teams): Yes
- RoostGPT CLI: triggers from local workspace: Yes; GitOps (GitHub, GitLab, Bitbucket): Yes; notifications (Slack/Teams): No
- RoostGPT VS Code Extension: triggers from local workspace: Yes; GitOps (GitHub, GitLab, Bitbucket): No; notifications (Slack/Teams): No
CLI
Installation Command
RoostGPT binary is available at https://github.com/roost-io/roost-support/releases
The RoostGPT binary offers flexible test generation: you can trigger tests both from your local workspace and through Git, driven by a single .env file.
Add the "roostgpt-<linux|macos|win.exe>" binary to your PATH environment variable
For Linux/macOS, you can run the following command to install roostgpt:
curl https://raw.githubusercontent.com/roost-io/roost-support/master/roostgpt.sh | bash -
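If you prefer to install manually, a minimal sketch for macOS is shown below (assuming you downloaded roostgpt-macos from the releases page; renaming it to roostgpt is only an illustrative choice):
chmod +x ./roostgpt-macos
sudo mv ./roostgpt-macos /usr/local/bin/roostgpt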
CLI Commands
Run the RoostGPT CLI with the binary for your operating system: ./roostGPT-macos (for macOS) or ./roostGPT-win.exe (for Windows).
- -h, --help: Displays the help message with information about the available options and usage.
- roostgpt version update: Updates the roostgpt version.
- roostgpt config create: Creates the roostgpt configuration file, which contains the environment variables used by the RoostGPT application.
  - --ui: Opens a web form for creating a downloadable configuration file.
  - --output-dir, -o <path>: Creates the configuration file in a specific directory. The default path is the current working directory.
  - --name <name>: Creates the configuration file with a specific name. The default name is default.env.
- roostgpt config update: Updates the roostgpt configuration file.
  - --config, -c <path>: Specifies the location of the environment configuration file.
  - --name <name>: Loads pwd/$name.env. Ignored if --config is provided.
- roostgpt test create: Triggers test generation.
  - --config, -c <path>: Path to the configuration file; if not provided, the configuration is read from environment variables.
- roostgpt test improve: Improves the selected test based on user feedback.
- roostgpt test retrigger: Re-triggers the chosen test from where it left off during its last execution.
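Putting these commands together, a minimal end-to-end flow looks like the sketch below (assuming the binary is on your PATH as roostgpt; the name myproject is only an illustrative choice):
roostgpt config create --name myproject
# edit myproject.env and fill in the required variables described below
roostgpt test create --config ./myproject.env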
Instructions for creating .env file
An easy way to download the .env file is to use https://app.roost.ai/gptCLIForm
The .env file should be in the same folder path as RoostGPT binary.
Here is what the content of your .env file will look like:
# GIT env vars
GIT_TYPE= # Default is github; Supports [github, gitlab, azure, bitbucket, local]
HOSTED_TYPE= # Default is cloud; Supports [cloud, hosted]
GIT_HOSTED_URL= # Required if HOSTED_TYPE=hosted
USE_SSH= # Optional; Default is false; Supports [true, false]; SSH based auth for Git
LOCAL_PROJECT_PATH= # Required if GIT_TYPE=local; path to your workspace
SOURCE_GIT_CLONE_URL= # Optional; the source repo clone URL
SOURCE_GIT_TOKEN= # Required
SOURCE_OWNER_NAME= # Required
SOURCE_REPO_NAME= # Required
SOURCE_REPO_ID= # Required for GIT_TYPE=gitlab
SOURCE_PROJECT_NAME= # Required for GIT_TYPE=azure
SOURCE_PROJECT_ID= # Required for GIT_TYPE=bitbucket
SOURCE_BASE_BRANCH= # Required
SOURCE_RELATIVE_DIRECTORY= # Optional; the relative path of the repository
SAME_TARGET_DETAIL= # Optional; Default true, Supports [true, false]
DESTINATION_GIT_CLONE_URL= # Optional; the destination repo clone URL
DESTINATION_GIT_TOKEN= # Required if SAME_TARGET_DETAIL=false
DESTINATION_OWNER_NAME= # Required if SAME_TARGET_DETAIL=false
DESTINATION_REPO_NAME= # Required if SAME_TARGET_DETAIL=false
DESTINATION_REPO_ID= # Required if SAME_TARGET_DETAIL=false and GIT_TYPE=gitlab
DESTINATION_PROJECT_NAME= # Required for GIT_TYPE=azure
DESTINATION_PROJECT_ID= # Required if SAME_TARGET_DETAIL=false and GIT_TYPE=bitbucket
DESTINATION_BASE_BRANCH= # Required if SAME_TARGET_DETAIL=false
# Open AI env vars
OPENAI_API_MODEL= # Optional; Default is gpt-4
OPENAI_API_KEY= # Required if AI_TYPE=openai
# Vertex AI env vars
VERTEX_FINE_TUNE= # Optional; Default is false, Supports [true, false]
VERTEX_PROJECT_ID= # Required if AI_TYPE=vertexai
VERTEX_REGION= # Required if AI_TYPE=vertexai and VERTEX_FINE_TUNE=true
VERTEX_BEARER_TOKEN= # Required if AI_TYPE=vertexai
VERTEX_MODEL= # Required if AI_TYPE=vertexai; Supports [text-bison, code-bison, codechat-bison]
# Azure Open AI env vars
AZURE_OPENAI_ENDPOINT= # Required if AI_TYPE=azure_open_ai
AZURE_DEPLOYMENT_NAME= # Required if AI_TYPE=azure_open_ai
AZURE_OPENAI_KEY= # Required if AI_TYPE=azure_open_ai
AZURE_OPENAI_VERSION= # Optional Default is 2023-12-01-preview
# Open source env vars
OPEN_SOURCE_MODEL_ENDPOINT= # Required if AI_TYPE=open_source_ai
OPEN_SOURCE_MODEL= # Optional; Supports [meta-llama/Llama-2-13b-chat, HuggingFaceH4/starchat-beta]
# Sagemaker model env vars
SAGEMAKER_MODEL_ENDPOINT= # Required if AI_TYPE=sagemaker_model
# Claude AI env vars
CLAUDE_AI_MODEL= # Required if AI_TYPE=claude_ai
CLAUDE_AI_API_KEY= # Required if AI_TYPE=claude_ai
# Advanced AI env vars
AI_TEMPERATURE= # Optional; Default value is 0.6
# Jira board env vars
JIRA_HOST_NAME= # Required if TEST_TYPE=functional and BOARD_TYPE=jira
JIRA_EMAIL= # Required if TEST_TYPE=functional and BOARD_TYPE=jira
JIRA_TOKEN= # Required if TEST_TYPE=functional and BOARD_TYPE=jira
JIRA_ID= # Optional if TEST_TYPE=functional and BOARD_TYPE=jira
# Azure board env vars
AZURE_ORG= # Required if TEST_TYPE=functional and BOARD_TYPE=azure
AZURE_PROJECT= # Required if TEST_TYPE=functional and BOARD_TYPE=azure
AZURE_TOKEN= # Required if TEST_TYPE=functional and BOARD_TYPE=azure
AZURE_WORK_ITEM_ID= # Optional if TEST_TYPE=functional and BOARD_TYPE=azure
# Log env vars
LOG_SOURCE= # Optional; Default is elks
LOG_SOURCE_PATH= # Optional; the log file path
LOG_ELASTICSEARCH_URL= # Optional; the Elastic search URL
LOG_ELASTICSEARCH_USER= # Optional; the Elastic search user name
LOG_ELASTICSEARCH_TOKEN= # Optional; the Elastic search token
LOG_ELASTICSEARCH_API_KEY= # Optional; the Elastic search api key
# Behavioural test cases env vars
BEHAVIORAL_TEST_TYPE= # Optional; Supports [gherkin]
BEHAVIORAL_TEST_SOURCE= # Optional; Supports [file, gitpath, url]
BEHAVIORAL_TEST_FILE_PATH= # Optional; path of the source file if source is file/gitpath. Relative path in case of gitpath.
BEHAVIORAL_TEST_URL= # Optional; URL of the source file if source is url
# API Spec env vars
API_SPEC_TYPE= # Optional; Supports [swagger, postman]
API_SPEC_SOURCE= # Optional; Supports [file, gitpath, url]
API_SPEC_FILE_PATH= # Optional; path of the source file if source is file/gitpath. Relative path in case of gitpath.
API_SPEC_URL= # Optional; URL of the source file if source is url
# API test env vars
HTTP_VERBS_FOR_TESTING= # Optional; Default is "get,post,put,patch,delete"; the specific HTTP methods to be tested for all APIs. Enter comma-separated values here.
REGEX_HTTP_ENDPOINTS_FOR_TESTING= # Optional; Regex string that matches the specific endpoints to be tested. If empty, all endpoints will be tested.
# License env vars
ROOST_DOMAIN= # Optional; Default is app.roost.ai
ROOST_TOKEN= # Required, the Roost token
TELEMETRY= # Optional; Default is true, Supports [true, false], send telemetry data to roost, no private information is shared.
# Additional vars
TEST_NAME= # Optional; Default is roost-test
ROOST_DIR= # Optional; Default is /var/tmp/Roost/RoostGPT
LANGUAGE= # Optional; Default is java; Supports [java, go, python, csharp, nodejs]
AI_TYPE= # Optional; Default is openai; Supports [openai, vertexai, azure_open_ai, open_source_ai, sagemaker_model, claude_ai]
PACKAGES_TO_SCAN= # Required for LANGUAGE=java, the package to scan and generate test for example - com.demo.sample
ITERATION= # Optional; Default is 1
TIMEOUT= # Optional; Default is 1 hour
TEST_TYPE= # Optional; Default is unit; Supports [unit, functional, api-spec-test, integration]
RETRIGGER= # Optional; Default is false, Supports [true, false]
TRIGGER_ID= # Unique id to identify multiple triggers; Default is epoch
BOARD_TYPE= # Optional; Default is jira; Supports [jira, azure, none]
GIT_PR_URL= # The Git PR URL of the generated test
MAX_DEPTH= # Optional; Default is -1; if MAX_DEPTH is -1 all subdirectories are traversed, otherwise it is the maximum directory depth to search
TEST_FRAMEWORK= # Optional; Default is "pytest" for Python, "gotest" for Golang, "JUnit4" for Java, "nunit" for CSharp, "jest" for Node.js, "postman" for Postman test-script. Supports [pytest, unittest] for Python, [gotest] for Golang, [JUnit4, JUnit5] for Java, [nunit] for Csharp, [jest, mocha] for Node.js, [postman, artillery, rest-assured, karate] for [test_type: api-spec-test] or [test_type: integration]
FUNCTIONS_TO_TEST= # Optional; list of function names to be tested, separated by comma. Name to be specified as module.[class.]method OR module.function
USE_ASSISTANT= # Optional: Use Assistant feature in openai. Default:false
# Improve test env vars
IMPROVE_TEST= # Optional; Default is false, Supports [true, false]
FILE_PATH= # Required if IMPROVE_TEST=true
USER_CONTENT= # Required if IMPROVE_TEST=true
TESTSCRIPT_ENDPOINT= # Required if IMPROVE_TEST=true and TEST_FRAMEWORK=postman
Here's a description of each variable:
# Git env vars
- GIT_TYPE: The type of Git repository. Default is "github". Supported values: "github", "gitlab", "azure", "bitbucket", "local". (Optional)
- HOSTED_TYPE: Where the Git platform is accessible. Default is "cloud". Supported values: "cloud", "hosted". (Optional)
- GIT_HOSTED_URL: URL of the hosted Git platform. (Required if HOSTED_TYPE="hosted")
- USE_SSH: SSH-based authentication for Git. Default is "false". Supported values: "true", "false". (Optional)
- LOCAL_PROJECT_PATH: Path to your workspace. (Required if GIT_TYPE="local")
- SOURCE_GIT_TOKEN: A token for authenticating against the source Git repository. (Required)
- SOURCE_GIT_CLONE_URL: The source repo clone URL. (Optional)
- SOURCE_OWNER_NAME: The name of the source repository owner (Git username). (Required)
- SOURCE_REPO_NAME: The name of the source repository. (Required)
- SOURCE_REPO_ID: The ID of the source repository. (Required for GIT_TYPE="gitlab")
- SOURCE_PROJECT_NAME: The name of the source project. (Required for GIT_TYPE="azure")
- SOURCE_PROJECT_ID: The ID of the source project. (Required for GIT_TYPE="bitbucket")
- SOURCE_BASE_BRANCH: The base branch to be tested in the source repository. (Required)
- SOURCE_RELATIVE_DIRECTORY: The relative directory path within the source repository. (Optional)
- SAME_TARGET_DETAIL: Set to true if the destination Git details are the same as the source Git details. Default is "true". Supported values: "true", "false". (Optional)
- DESTINATION_GIT_TOKEN: A token for authenticating against the destination Git repository. (Required if SAME_TARGET_DETAIL=false)
- DESTINATION_GIT_CLONE_URL: The destination repo clone URL. (Optional)
- DESTINATION_OWNER_NAME: The name of the destination repository owner (Git username). (Required if SAME_TARGET_DETAIL=false)
- DESTINATION_REPO_NAME: The name of the destination repository. (Required if SAME_TARGET_DETAIL=false)
- DESTINATION_REPO_ID: The ID of the destination repository. (Required if SAME_TARGET_DETAIL=false and GIT_TYPE="gitlab")
- DESTINATION_PROJECT_NAME: The name of the destination project. (Required if SAME_TARGET_DETAIL=false and GIT_TYPE="azure")
- DESTINATION_PROJECT_ID: The ID of the destination project. (Required if SAME_TARGET_DETAIL=false and GIT_TYPE="bitbucket")
- DESTINATION_BASE_BRANCH: The base branch in the destination repository to which the test code is pushed. (Required if SAME_TARGET_DETAIL=false)
# Open AI env vars
- OPENAI_API_MODEL: The OpenAI API model to use. Supported values: "gpt-4", "gpt-3.5-turbo", etc. (model availability depends on the OPENAI_API_KEY). Default is "gpt-4". (Optional)
- OPENAI_API_KEY: The API key for accessing the OpenAI API. (Required if AI_TYPE="openai")
# AZURE Open AI env vars
- AZURE_OPENAI_ENDPOINT: API endpoint for accessing Azure OpenAI. (Required if AI_TYPE="azure_open_ai")
- AZURE_DEPLOYMENT_NAME: Name of the Azure OpenAI deployment. (Required if AI_TYPE="azure_open_ai")
- AZURE_OPENAI_KEY: The API key for accessing the Azure OpenAI API. (Required if AI_TYPE="azure_open_ai")
- AZURE_OPENAI_VERSION: The Azure OpenAI API version to use. Default is "2023-12-01-preview". (Optional)
# Vertex AI env vars
- VERTEX_PROJECT_ID: The ID of the Vertex project. (Required if AI_TYPE="vertexai")
- VERTEX_REGION: The region where the Vertex project is located (for example, "us-central1"). (Required if AI_TYPE="vertexai" and VERTEX_FINE_TUNE="true")
- VERTEX_BEARER_TOKEN: The bearer token for accessing the Vertex API. (Required if AI_TYPE="vertexai")
- VERTEX_MODEL: The name of the Vertex model to use. Supported values: "text-bison", "code-bison", "codechat-bison". (Required if AI_TYPE="vertexai")
- VERTEX_FINE_TUNE: Whether fine-tuning is enabled for the model. Supported values: "true", "false". Default is "false". (Optional)
# Open Source AI env vars
- OPEN_SOURCE_MODEL_ENDPOINT: API endpoint for accessing the open-source AI model. (Required if AI_TYPE="open_source_ai")
- OPEN_SOURCE_MODEL: The name of the open-source model to use. Supported values: "meta-llama/Llama-2-13b-chat", "HuggingFaceH4/starchat-beta". (Optional)
# SageMaker Model env vars
- SAGEMAKER_MODEL_ENDPOINT: The endpoint where the SageMaker model is hosted. (Required if AI_TYPE="sagemaker_model")
# Claude AI env vars
- CLAUDE_AI_MODEL: The Claude AI model to use. Supported values: "claude-3-opus-20240229", "claude-3-sonnet-20240229", "claude-3-haiku-20240307". (Required if AI_TYPE="claude_ai")
- CLAUDE_AI_API_KEY: The API key for accessing the Claude AI API. (Required if AI_TYPE="claude_ai")
# Advanced AI env vars
- AI_TEMPERATURE: The AI temperature to use during test generation. Default is 0.6. (Optional)
# Jira board env vars
- JIRA_HOST_NAME: The hostname of your Jira instance. (Required if TEST_TYPE="functional" and BOARD_TYPE="jira")
- JIRA_EMAIL: The email address associated with your Jira account. (Required if TEST_TYPE="functional" and BOARD_TYPE="jira")
- JIRA_TOKEN: The access token for your Jira account. (Required if TEST_TYPE="functional" and BOARD_TYPE="jira")
- JIRA_ID: The Jira ID associated with your account. (Optional if TEST_TYPE="functional" and BOARD_TYPE="jira")
# Azure board env vars
- AZURE_ORG: The organization associated with your Azure DevOps account. (Required if TEST_TYPE="functional" and BOARD_TYPE="azure")
- AZURE_PROJECT: The name of the Azure DevOps project. (Required if TEST_TYPE="functional" and BOARD_TYPE="azure")
- AZURE_TOKEN: The access token for authenticating with Azure DevOps. (Required if TEST_TYPE="functional" and BOARD_TYPE="azure")
- AZURE_WORK_ITEM_ID: The ID of the relevant work item. (Optional if TEST_TYPE="functional" and BOARD_TYPE="azure")
# Log env vars
- LOG_SOURCE: The origin from which the logs are generated. Default is "elks". (Optional)
- LOG_SOURCE_PATH: The location where the log files are stored. (Optional)
- LOG_ELASTICSEARCH_URL: The Elasticsearch URL. (Optional)
- LOG_ELASTICSEARCH_USER: The user name for accessing Elasticsearch resources. (Optional)
- LOG_ELASTICSEARCH_TOKEN: The authentication token for accessing Elasticsearch resources. (Optional)
- LOG_ELASTICSEARCH_API_KEY: The API key used to authenticate and authorize access to Elasticsearch APIs. (Optional)
# Behavioural Test cases env vars
- BEHAVIORAL_TEST_TYPE: The type of behavioural tests. Supported value: "gherkin". (Optional)
- BEHAVIORAL_TEST_SOURCE: The location or source of the behavioural tests. Supported values: "file", "gitpath", "url". (Optional)
- BEHAVIORAL_TEST_FILE_PATH: Path of the source file if the source is file/gitpath; a relative path in the case of gitpath. (Optional)
- BEHAVIORAL_TEST_URL: The URL of the source file if the source is url. (Optional)
# API Spec env vars
- API_SPEC_TYPE: The type of API specification. Supported values: "swagger", "postman". (Optional)
- API_SPEC_SOURCE: The location or source of the API specification. Supported values: "file", "gitpath", "url". (Optional)
- API_SPEC_FILE_PATH: Path of the source file if the source is file/gitpath; a relative path in the case of gitpath. (Optional)
- API_SPEC_URL: The URL of the source file if the source is url. (Optional)
# API test env vars
- HTTP_VERBS_FOR_TESTING: The specific HTTP methods to be tested for all APIs, entered as comma-separated values. Default is "get,post,put,patch,delete". Supports any combination of HTTP verbs. (Optional)
- REGEX_HTTP_ENDPOINTS_FOR_TESTING: A regex string that matches the specific endpoints to be tested. If empty, all endpoints will be tested. (Optional)
See the snippet below for how these two variables are typically combined.
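For example, the following illustrative values (placeholders, not defaults) would restrict API test generation to GET and POST requests on endpoints under /api/v1/users:
HTTP_VERBS_FOR_TESTING=get,post
REGEX_HTTP_ENDPOINTS_FOR_TESTING=^/api/v1/users.*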
# License env vars
- ROOST_DOMAIN: The Roost domain. Default is "app.roost.ai". (Optional)
- ROOST_TOKEN: Authentication token for the RoostGPT CLI. (Required)
- TELEMETRY: Whether to send telemetry data to Roost; no private information is shared. Default is "true". Supported values: "true", "false". (Optional)
# Additional vars
- TEST_NAME: The name of the test. Default is "roost-test". (Optional)
- ROOST_DIR: The directory path for Roost. Default is "/var/tmp/Roost/RoostGPT". (Optional)
- LANGUAGE: The programming language of the source code. Supported values: "java", "go", "python", "csharp", "nodejs". Default is "java". (Optional)
- AI_TYPE: The AI model provider used to generate tests. Supported values: "openai", "vertexai", "azure_open_ai", "open_source_ai", "sagemaker_model", "claude_ai". Default is "openai". (Optional)
- PACKAGES_TO_SCAN: The packages to scan; supports a single package or comma-separated values, for example com.example.product or com.example.product,com.example.controller. (Required for LANGUAGE="java")
- ITERATION: The number of iterations used to improve and run the test. Default is "1". (Optional)
- TIMEOUT: The timeout duration, after which test generation stops automatically. Default is "1 hour". (Optional)
- TEST_TYPE: The type of test to generate. Supported values: "unit", "functional", "api-spec-test", "integration". Default is "unit". (Optional)
- RETRIGGER: Whether to re-trigger an existing test generation from where it left off. Supported values: "true", "false". Default is "false". (Optional)
- TRIGGER_ID: A unique ID used to identify multiple triggers. Default is the epoch timestamp. (Optional)
- BOARD_TYPE: The type of board to use. Supported values: "jira", "azure", "none". Default is "jira". (Optional)
- GIT_PR_URL: The Git PR URL of the generated test. (Optional)
- MAX_DEPTH: The maximum depth of directories to search for test files; if set to -1, all subdirectories are traversed. (Optional)
- TEST_FRAMEWORK: The test framework to use. Supports [pytest, unittest] for Python, [gotest] for Golang, [JUnit4, JUnit5] for Java, [nunit] for C#, [jest, mocha] for Node.js, and [postman, artillery, rest-assured, karate] for TEST_TYPE="api-spec-test" or TEST_TYPE="integration". (Optional)
- FUNCTIONS_TO_TEST: A comma-separated list of function names to be tested, specified as module.[class.]method or module.function; see the illustrative example below. (Optional)
- USE_ASSISTANT: Use the Assistant feature in OpenAI. Default is "false". (Optional)
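As an illustration of the FUNCTIONS_TO_TEST format (the names below are placeholders, not from any real project):
FUNCTIONS_TO_TEST=orders.OrderService.create_order,utils.parse_date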
# Improve Test env vars
- IMPROVE_TEST: Set to true to improve an already generated test. Supported values: "true", "false". Default is "false". (Optional)
- FILE_PATH: Location of the file containing the test to improve. (Required if IMPROVE_TEST=true)
- USER_CONTENT: Feedback provided by the user for improving the generated test. (Required if IMPROVE_TEST=true)
- TESTSCRIPT_ENDPOINT: The endpoint to be improved. (Required if IMPROVE_TEST=true and TEST_FRAMEWORK="postman")
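For orientation, here is a minimal sketch of a .env for generating Java unit tests from a GitHub Cloud repository with OpenAI; all values are placeholders, and every variable not shown is left at its default:
GIT_TYPE=github
SOURCE_GIT_TOKEN=<your-github-token>
SOURCE_OWNER_NAME=<github-username>
SOURCE_REPO_NAME=<repository-name>
SOURCE_BASE_BRANCH=main
AI_TYPE=openai
OPENAI_API_KEY=<your-openai-api-key>
LANGUAGE=java
PACKAGES_TO_SCAN=com.demo.sample
TEST_TYPE=unit
ROOST_TOKEN=<your-roost-token>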
VS Code Extension
The RoostGPT VS Code extension lets you generate tests for your code with a single click, straight from your VS Code workspace.
Download:
https://marketplace.visualstudio.com/items?itemName=RoostGPT.roostgpt
Installation:
To use the RoostGPT VS Code extension, you must have VS Code installed on your system, along with any dependencies required to run your code, since RoostGPT will often run its generated test code in order to improve it.
You can download and install VS Code for your operating system here.
After VS Code is successfully installed, download the RoostGPT VS Code extension from the VS Code Marketplace by searching for RoostGPT in the extension store. Alternatively, you can download and install the VS Code extension from here.
Once the extension is installed, you are ready to generate tests for your code.
Configuration:
Once the extension has been installed, configure it before generating tests by providing the information required for test generation.
To configure the extension, open the extension settings for RoostGPT; you can search for it in the extension store or find it in the list of your installed extensions.
You can then set the values according to your workspace and needs. The following fields are required in all cases:
Required Fields
- Roost Token: You can get your Roost token from the My Profile page on app.roost.ai. If you don't have a Roost token, you can sign up for a free trial using your organization email and try out RoostGPT for free from here.
- Roost Domain: Enter the domain in which the provided Roost token is active; the default value is app.roost.ai.
- Timeout: Set the timeout for test generation (in hours); the default value is 1.
- Language: Select the language your workspace/source code is written in; currently supports Java, Python, Go, NodeJS, and C#.
- Board Type: Select the type of scrum/kanban board. Required for functional tests; set it to none for other test types. Supports none, Jira, and Azure boards; the default value is none.
- Iterations: Set the number of iterations for test generation. If you provide an iteration value greater than 0, the generated test cases are run, any errors are passed back to the AI model, the test case is updated in the same file, and the cycle repeats up to the number of iterations, stopping early once the tests run successfully. The default value is 2.
- Telemetry: Set to false if you do not want to send telemetry data to Roost. The default value is true.
- Generative AI model: Select which model to use for test code generation; supports OpenAI, Google Vertex, Azure OpenAI, Claude AI, and hosted open-source models (LLAMA2 and starchat).
- ProvideInput: Set to true if you want to provide your own input before test generation; you will be prompted for input before test generation begins. The default value is false.
- MaxDepth: Specifies how deep into the workspace the extension will traverse when scanning for files to generate tests for. The default is to traverse all subdirectories.
- EnvFile: The path to an env file whose user environment variables will be taken into account during test generation. You can leave this field empty.
- AI model Details:
  - If the Generative AI model is OpenAI:
    - OpenAI API Key: Provide your OpenAI API key if you plan to use OpenAI as the generative AI model for your test cases.
    - OpenAI API model: Provide the AI model the API key has access to; supports gpt-4, gpt-3.5-turbo, and gpt-3.5-turbo-16k.
  - If the Generative AI model is Google Vertex:
    - Vertex Bearer Token: Provide your Vertex bearer token if you plan to use Google Vertex as the generative AI model for your test cases.
    - Vertex Project ID: Provide the ID of your Google Vertex project.
    - Vertex Region: Enter the region where your Vertex project is located.
    - Vertex Model: Select the Vertex model to be used for code generation; supports text-bison, code-bison, and codechat-bison. The default value is text-bison.
  - If the Generative AI model is Claude AI:
    - AI model: Select the AI model to be used from the dropdown menu.
    - API Key: Provide your Claude AI API key.
  - If the Generative AI model is Azure OpenAI:
    - API Key: Provide the API key for your Azure OpenAI model.
    - API Endpoint: Provide the API endpoint where your Azure OpenAI model is hosted.
    - Deployment Name: Enter the deployment name for your Azure OpenAI model.
  - If the Generative AI model is Open Source:
    - Open Source Model Endpoint: Provide the endpoint for the open-source model if you plan to use one of the Roost-provided open-source models, in the format 'http://MODEL_IP:5000/generate', where MODEL_IP is the IP address of the instance running the model's container.
    - Open Source AI model: Select the AI model to be used for test generation; supports meta-llama/Llama-2-13b-chat and HuggingFaceH4/starchat-beta. The default value is meta-llama/Llama-2-13b-chat.
Test Generation:
Once your extension configuration is complete, you can then start using the VS Code extension to generate tests for your workspace. To generate tests, simply right-click on a file in your Explorer menu and select the type of test you want to generate from the context menu that shows up. Note that each test type has some requirements to start test generation.
The test types currently supported are as follows:
- Unit Tests.
- API Tests.
- Functional Tests.
- Integration Tests.
Test Requirements
The following are the required fields and other instructions for test generation for each supported test type:
- Unit Tests: To generate unit tests, select the directory your file is in, right-click on a file, select unit test generation, and then select the test framework you want to use from the popup. Unit tests are generated for all files in that file's parent directory. No fields beyond the required fields above are needed for unit test generation. Make sure that the language set in the extension settings matches the language of your source code.
  - If you want tests generated only for a few specific functions rather than the entire codebase, provide the function names in the FunctionsToTest input box in the advanced section of the extension settings as comma-separated values (e.g. Func1, Func2, ...). Tests will then be generated only for the specified functions.
  - For Java unit tests, make sure to trigger test generation from a valid module containing a valid pom.xml, or generate tests for the entire project if a pom.xml is present in the workspace folder.
  - For React unit tests, you need to use the GPT-4 turbo model and set Use Assistant to true in the advanced section of the VS Code extension settings.
- API Tests: To generate API tests, right-click on a Postman collection JSON or Swagger API spec file and select Generate API tests. If you choose any file other than your API spec file, test generation will fail. Then select the test framework to be used (artillery, postman, or rest-assured) from the popup. No fields beyond the required fields above are needed for API tests. You can also filter which HTTP verbs (such as post, get, etc.) are tested by changing the HttpFilters setting in the advanced section. Note that if you select postman as the test framework, you will need the newman CLI installed on your system to run generated tests from the RoostGPT extension; you can install it with the command: npm install -g newman.
  - If you want to generate API tests only for specific HTTP verbs (get, post, put, patch, delete, etc.), select the verbs to be tested in the advanced section of the extension settings under the HTTP filters attribute.
  - If you want to generate API tests only for API endpoints matching a given regex pattern, set the regex pattern in the advanced tab of the extension settings under the HTTP endpoints for testing attribute.
  - For karate and rest-assured tests, make sure that test generation is triggered from within a valid Java/Maven repository, i.e. put the API spec file in the Java repository and start test generation from there.
- Integration Tests: To generate integration tests, right-click on your Postman collection JSON or Swagger API spec file and select the Integration Tests option. If you choose any file other than your API spec file, test generation will fail. After selecting the option, select the test framework to be used (artillery, postman, or rest-assured), then select the type of your Gherkin template: either choose file and browse to your Gherkin template file, or choose URL and provide the URL to your Gherkin template. No fields beyond the required fields above are needed for integration tests. Note that if you select Postman as the test framework, you will need the newman CLI installed on your system to run generated tests from the RoostGPT extension (npm install -g newman). For karate and rest-assured tests, make sure that test generation is triggered from within a valid Java/Maven repository, i.e. put the API spec file in the Java repository and start test generation from there.
- Functional Tests: To generate functional tests, select your board type as either Jira or Azure. The following details are also required:
  - If the Board Type is Jira:
    - Jira Email
    - Jira Token
    - Jira Hostname
  - If the Board Type is Azure:
    - Azure Org
    - Azure Token
    - Azure Project
Improve and Analyze Generated Tests
After the test generation process is complete, a side panel opens showing all the generated test files. Select the file you want to view using the dropdown provided; if you want, you can also edit the files in the panel itself and save your changes with the provided save button.
If you want to run the generated tests, use the run button in the side panel; this runs the selected test file. Note that you will need all the dependencies required for running the tests installed on your local system. For artillery tests, after you click the run button you will be prompted to enter the target URL for the tests; if you want to provide a target URL, enter it in the input box. You will then be asked whether you want to upload a .env file to provide environment variables; if yes, you can upload the env file.
If you are not satisfied with the generated tests and want improvements or changes, enter a feedback prompt for the AI model at the bottom of the side panel and click the improve button; this triggers test improvement.
Public SaaS
RoostGPT UI:
Access Public SaaS at https://app.roost.ai/login
RoostGPT allows users to automate test generation against their code repositories.
Below is the UI structure of RoostGPT, which has two header tabs: Test and Events.
- The Events tab shows the details of individual test generation triggers and is explained in detail later.
- The table under the Test tab shows details such as:
  - Test Name
  - Created By
  - GenAI model used for test generation
  - Test Type, whether it is Unit Test, Functional Test, API (using Swagger), or API (using source code)
  - Test creation date
- Actions available in the Test tab:
  - trigger the test generation
  - view the test configuration or workflow
  - delete the test
- The search box on the page allows searching tests by name.
- Project admins can view and edit anyone's test workflows.
- Regular project members can view only their own test workflows.
View of the Test Tab
Add Test
Clicking Add Test opens the page below, which has five sections:
- Provide a Test Name.
- Select the Test Type: Unit Tests; Functional and Non-Functional Tests; API (using Swagger); API (using source code); and Integration Tests.
- Choose the GenAI model, which offers the OpenAI model and the VertexAI model, with fields to input their respective tokens. After token verification, model-specific details become available for selection.
- Select the SCCS (source code repository) from those available:
  - GitHub (cloud and self-hosted)
  - GitLab (cloud and self-managed)
  - Azure DevOps
  - Bitbucket (cloud and self-hosted)
- After the SCCS token verification, you can provide the code repo, branch, language, and their versions. Languages supported are Java, Python, C#, and Go.
- Optional integration with ticketing tools like Jira and Azure DevOps is available:
  - Jira requires the email address associated with your Jira account, the Jira hostname, and the access token for your Jira account.
  - Azure DevOps requires the organization name associated with your Azure DevOps account, an access token for authenticating with Azure DevOps, and the Azure DevOps project name.
- In the Advanced section, a Timeout (in hours) can be specified; this is how long the test remains in triggered mode.
Clicking the Save button saves your test, which then appears under the Test header section. Clicking the download icon downloads the configuration for the test.
Workflow View
This shows the workflow of your test, i.e. the details you filled in while adding the test.
Events View
The Events view contains the status of all triggered tests. The Event Status filter shows whether an event is in progress, timed out, aborted, completed, or failed. The fields in the table show the test name with the event status and status icons, along with repository information, the type of test, creation and completion times, user information, and action icons for re-triggering the test, viewing logs, and viewing insights.
Logs View
The Logs view contains the event information, AI model information, and test result information (including the PR URL of your generated test) on the left side, and the logs container on the right side, which has a search bar for searching the logs and copy and download icons to copy and download the logs.
Insights View
The Insights view gives you details about your test: it shows your test files, their creation and completion times, and corresponding action icons for Analyze and Improve (for further analysis and improvement of your tests) and Download (for downloading the test file).
Clicking the eye icon in Actions shows the test files and their corresponding actions. Under Actions, clicking Analyze and Improve opens a modal that shows the test and asks for feedback to improve it.
Clicking the download icon downloads the generated test file.
Self Hosted Solution
For a self-hosted cloud offering, please reach out to us at support@roost.ai. We'd be delighted to assist you with your specific requirements.
RoostGPT Permissions
This document outlines the permissions and licenses related to the use of third-party components within RoostGPT.
Introduction
RoostGPT integrates with various third-party components to enhance its functionality and provide a comprehensive user experience. We value transparency and wish to provide clarity on the permissions required to ensure seamless integration with these external components.
Third-Party Components and Permissions
1. Component Name: Git
- Description: Git is a distributed version control system.
- Supported Types: Github, Gitlab, Bitbucket, Azure DevOps (Both Cloud and Self Hosted)
- Purpose in Our Software: RoostGPT uses Git to read the source code, create automated tests for it, and create a PR in the same repository for the tests.
- Required Permissions:
- User Read.
- Repo Read.
- Repo Write.
- Create Commit.
- Create PR.
2. Component Name: Jira/Azure Board
- Description: It's used by development teams to track bugs, enhancements, tasks, and other kinds of issues throughout the software development lifecycle.
- Purpose in Our Software: RoostGPT fetches tickets based on commit messages and uses them as acceptance criteria for automated test generation.
- Required Permissions:
- API Access
- Tickets Read.
- Comments Write
3. Component Name: Log Server
- Description: A log server is a centralised system designed to collect, store, and manage logs from various sources, including applications, systems, devices, and network infrastructure. By consolidating log data in one place, log servers facilitate easier monitoring, analysis, and troubleshooting.
- Supported Types: LogStash, Log File
- Purpose in Our Software: RoostGPT fetches logs from the log server based on the input request to enhance the generated tests using real-life data.
- Required Permissions:
- API Access
- Logs Read.
4. Component Name: AI Model
- Description: An AI Model is an advanced computational model trained on vast amounts of text data to understand and generate human-like language. Leveraging deep learning techniques, LLMs can comprehend context, answer questions, and assist in various language-related tasks, showcasing the pinnacle of natural language processing capabilities.
- Supported Types: OpenAI, Vertex AI, Azure OpenAI, LLAMA2, Starchat and other open source models.
- Purpose in Our Software: RoostGPT uses an AI model to generate automated test cases for the source code.
- Required Permissions:
- API Access.