
Getting Started

Welcome to Spark, the framework for running your integration tests. This section contains a number of pages with instructions on the various features that will help you use Spark to its full capabilities.

Latest versions

nodejs: 2023.1.2
ruby: 2023.1.2

Setup

To start using Spark, first follow the setup instructions for your operating system and for the programming language you would like to use, JavaScript or Ruby. Our docs include instructions for macOS, Windows and Linux (Fedora and Ubuntu).

Creating a project

Once you have set up your device with the necessary packages for Spark, you are almost ready to start running tests. First you will need to create a project, which you can do via Flare. Then use a Terminal or Command Prompt session at the root of the project to start running tests.

Running tests

To run tests, use the command below for your chosen language from the root of the project.

Ruby:

cucumber features
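The JavaScript command is not shown on this page. Assuming the JavaScript flavour of Spark is built on the cucumber-js package, the equivalent run would look something like the following (the exact command for your project may differ):

npx cucumber-js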

Configuration

To change what your tests run on, refer to the Spark configuration docs. The .yml files within the default_config directory in your project are used to tailor test runs to your needs.
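As a rough orientation, the two files discussed on this page live directly under default_config; your project may contain further .yml files depending on the integrations you use:

default_config/
  accounts.yml    credentials for vendor integrations such as BrowserStack
  browser.yml     what your tests run on and where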

For example, to enable the BrowserStack integration:

default_config/browser.yml
source: browserstack

You can set the BrowserStack capabilities you would like to change in the browserstack section of browser.yml. This integration also requires you to add your credentials to accounts.yml:

default_config/accounts.yml
browserstack:
  username: Your Username
  access_key: Your Access Key
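The capability keys that Spark accepts under the browserstack section are listed in the configuration docs rather than on this page. As an illustrative sketch only, using BrowserStack's usual capability names (which Spark may or may not mirror exactly), the section might look something like this:

default_config/browser.yml
source: browserstack
browserstack:
  browser: Chrome
  browser_version: latest
  os: Windows
  os_version: '11'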

Writing your own tests

When you are ready to start writing your own tests, refer to the page object tutorial. We advocate a page object model code structure, in line with the Selenium WebDriver best practice recommendations. Within the Tutorials section there are a number of walkthroughs on using the various vendor integrations we provide for cross-browser, cross-device, visual and accessibility testing.
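To give a flavour of the pattern before you reach the tutorial, here is a minimal page object sketch in Ruby using the selenium-webdriver gem. The page, URL and locators are purely illustrative and not part of Spark itself:

# A minimal page object: one class per page, exposing the actions and
# queries the tests need rather than raw WebDriver calls.
require 'selenium-webdriver'

class SearchPage
  SEARCH_BOX = { name: 'q' }.freeze               # locators live with the page
  RESULTS    = { css: '#results .result' }.freeze

  def initialize(driver)
    @driver = driver
  end

  def load
    @driver.navigate.to 'https://example.com/search'
    self
  end

  def search_for(term)
    box = @driver.find_element(SEARCH_BOX)
    box.clear
    box.send_keys(term, :return)
    self
  end

  def result_titles
    @driver.find_elements(RESULTS).map(&:text)
  end
end

Used from a Cucumber step definition (assuming a @driver instance created in your support code), the step then reads in terms of user intent rather than element lookups:

When('I search for {string}') do |term|
  SearchPage.new(@driver).load.search_for(term)
end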

For a greater understanding and assistance with writing tests, the Code section of the Spark documentation has a number of guides covering the different types of tests you can undertake with Spark.