---------------------
Sonos SMAPI Self-Test
---------------------

Requirements
------------

   Python v2.7.x

   3rd Party Python Packages:
   - lxml (3.3.5): https://pypi.python.org/pypi/lxml
   - Pillow (2.5.3): https://pypi.python.org/pypi/Pillow
   - requests (2.5.1): https://pypi.python.org/pypi/requests
   If pip (a Python package manager for use with the Python Package Index) is set up
   (https://pip.pypa.io/en/latest/installing.html), these can be installed by running
   this command from "[Selftest Directory]/smapi/content_workflow/":
	pip install -r requirements.txt
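
   If you want to check the pinned versions before installing, the versions listed
   above suggest requirements.txt contents along these lines (a sketch only; the
   file shipped with your package is authoritative):

      lxml==3.3.5
      Pillow==2.5.3
      requests==2.5.1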

   Other 3rd Party Packages:
   - OpenSSL (latest version) (https://www.openssl.org/) (This does not come pre-installed on Windows)

   The computer running the test suite must have internet access because some
   additional resources will be downloaded from the web.


Optional Packages
-----------------

   3rd Party Python Packages:
   - psutil (2.2.1): https://pypi.python.org/pypi/psutil
     This package is used to print out the connections used by OpenSSL for SSL validation.
     It can be installed with 'pip install psutil==2.2.1'.


Package
-------

   Download the package from http://musicpartners.sonos.com and extract
   it to a new directory.

   Verify the directory tree as below:
      sonos-selftest
      -- smapi
         -- content_workflow
         -- service_configs
         -- utilities

   The sonos-selftest folder includes the smapi folder and all Python
   "egg" packages. The smapi folder includes a content_workflow folder,
   which contains all test modules and a sonos.wsdl file, and a
   service_configs folder, which contains the smapiConfig.cfg file.  The
   utilities folder is empty and will be populated when the test runs.


Configuration File
------------------

   Before the test suite will run, you must configure the file
   smapi/service_configs/smapiConfig.cfg. This configuration file is broken
   into sections which represent the various functionalities supported by the
   service. All fields applicable to a service should be filled out.  Leaving
   any applicable fields blank may reduce test coverage or even prevent
   execution of the test suite.
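
   The file uses standard INI syntax, so it can be inspected with Python 2.7's
   built-in ConfigParser module. The snippet below is only an illustration of the
   format (the self-test has its own loader); the path is assumed to be relative
   to smapi/content_workflow:

      import ConfigParser                      # Python 2.7 standard library

      config = ConfigParser.ConfigParser()
      config.read('../service_configs/smapiConfig.cfg')

      # Print every section and its key/value pairs for a quick sanity check.
      for section in config.sections():
          print section
          for key, value in config.items(section):
              print '    %s = %s' % (key, value)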

   [Service Name]
   serviceName = YourServiceName
   serviceId =
   (Comment: In this section, enter the name of your service as it 
             appears in your Sonos Labs application.  Leave the 
             serviceId entry blank as this is used for internal 
             testing by Sonos.  This section is required.)

   [Endpoints]
   insecure = http://YourServiceEndpointURL
   secure = https://YourServiceSecureEndpointURL
   (Comment: In this section, you will supply two endpoints used to 
             access your service.  Please enter the same endpoints 
             that you supplied in your Sonos Labs application.  The 
             self-test will use these endpoints to send SOAP 
             requests to your service.  This section is required 
             and both entries must be filled out.)

   [Strings File]
   stringsLocation = http://www.yourservice.com/static/strings.xml
   (Comment: In this section, you will supply the location of your 
             strings.xml file.  Please enter the same URL that you 
             provided in your Sonos Labs application.  The 
             self-test will run a set of tests that validate your 
             strings file against an xsd.  This section is required 
             if you supply this file in your application.  If you do
             not supply a strings file in your application you can 
             leave this section blank.)

   [Presentation Map File]
   pmapLocation = http://www.yourservice.com/static/pmap.xml
   (Comment: In this section, you will supply the location of your 
             presentation map xml file.  Please enter the same URL 
             that you provided in your Sonos Labs application.  The
             self-test will run a set of tests that validate your 
             presentation map file against an xsd.  If you do not 
             supply a presentation map in your application you can
             leave this section blank.  NOTE: This test requires that
              you have the lxml toolkit installed.  Please see the 
             README file for installation instructions.  This section
             is required if you supply this file in your application.) 
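
   For reference, the checks described above are ordinary lxml XSD validation.
   The sketch below is only an illustration; the file names are placeholders and
   the actual XSDs are supplied by the self-test itself:

      from lxml import etree

      # Placeholder file names; substitute your own strings/pmap XML and the
      # XSD used by the suite.
      schema = etree.XMLSchema(etree.parse('strings.xsd'))
      document = etree.parse('strings.xml')

      if schema.validate(document):
          print 'strings.xml is valid against strings.xsd'
      else:
          for error in schema.error_log:
              print error.message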

   [Authentication Type]
   sessionID = True
   anonymous = False
   stateless = False
   oauth = False
   (Comment: In this section, you will set to "True" the type of 
             authentication your service uses.  This should match what 
             you enabled in your Sonos Labs application.  The self-test 
             will use this information to determine how to authenticate 
             with your service.  Set the authentication your service uses 
             to "True" and leave all other entries as "False".  In the 
             example above, the service uses sessionID authentication.  
             This section is required.)

   [Device Link]
   householdId = any_name_householdid
   token = any_name_token
   privateKey = any_name_privatekey
   migrationToDeviceLink = False
   (Comment: This section is only used by services that use OAuth/DeviceLink
             authentication.  This information is how our tests will actually
             authenticate with your service.  The easiest way to fill this
             out is to use something like SOAP UI to send a getDeviceLinkCode
             request with the householdId entered above (see the sketch after
             this section).  Then authenticate with your website and send a
             getDeviceAuthToken request with the householdId and linkCode.
             Take the token in the getDeviceAuthToken response and enter
             it for the token value.  Enter the same householdId and, if your
             service also returns a privateKey, enter that as well.  This 
             section is required only for those services that use 
             OAuth/DeviceLink authentication.  If your service is migrating from
             SessionId to DeviceLink authentication, set migrationToDeviceLink to True.
             You must also provide a supported account within the accounts section
             below.  We will use this account to verify the correct SOAP Fault when
             trying to call getSessionId after the migration has been completed.)
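
   As an alternative to SOAP UI, a plain Python sketch of the getDeviceLinkCode
   request is shown below.  It assumes the standard SMAPI namespace
   http://www.sonos.com/Services/1.1 and a minimal envelope; consult the
   sonos.wsdl shipped with the suite for the exact envelope and headers your
   service expects.

      import requests

      # Assumptions: ENDPOINT and HOUSEHOLD_ID come from your smapiConfig.cfg;
      # the namespace and SOAPAction follow the sonos.wsdl shipped with the
      # suite.  Adjust the envelope if your service requires extra headers.
      ENDPOINT = 'https://YourServiceSecureEndpointURL'
      HOUSEHOLD_ID = 'any_name_householdid'
      NS = 'http://www.sonos.com/Services/1.1'

      envelope = """<?xml version="1.0" encoding="utf-8"?>
      <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
        <soap:Body>
          <getDeviceLinkCode xmlns="%s">
            <householdId>%s</householdId>
          </getDeviceLinkCode>
        </soap:Body>
      </soap:Envelope>""" % (NS, HOUSEHOLD_ID)

      response = requests.post(
          ENDPOINT,
          data=envelope,
          headers={'Content-Type': 'text/xml; charset=utf-8',
                   'SOAPAction': '%s#getDeviceLinkCode' % NS})
      print response.status_code
      print response.text   # on success, typically contains linkCode and regUrl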

   [Accounts]
   supportedAccount = accountname@foobar.com
   supportedPassword = supportedpassword
   unsupportedAccount = unsupportedaccountname@foobar.com
   unsupportedPassword = unsupportedpassword
   expiredTrialAccount = expiredaccountname@foobar.com
   expiredTrialPassword = expiredpassword
   unsupportedTerritoryAccount =
   unsupportedTerritoryPassword =
   (Comment: This section is used by services that use sessionID or
             Stateless authentication.  The tests use the different account types
             to test different authentication scenarios.  Supported 
             Accounts are used to authenticate and run most tests.  
             Unsupported, expired and unsupported territory accounts 
             are used to test error conditions and messaging.  If your
             service only has one type of account, just add account 
             information in the supportedAccount entry.  If your service
             has multiple tiers of accounts and only one of those is 
             supported on Sonos, add an unsupported account.  If trial accounts
             can expire, add an expired trial account.  If you have territories
             that are not supported on Sonos, add an unsupported territory
             account.  Leave those account types you do not have blank.  
             In all cases, use the accounts you have supplied in the Sonos
             Labs application.  In the example above, the service has more 
             than one account type and has supplied the account information 
             for a supported account, an unsupported account and an expired
             trial account.  The service does not have unsupported territory
             accounts, so that entry is left blank.  This section is 
             required for those services that use sessionID or Stateless 
             authentication.)

   [Capabilities]
   search = True
   favoritesTracks = True
   favoritesAlbums = False
   favoritesArtists = False
   userContentPlaylists = True
   trialMerge = False
   playbackLogging = False
   socialMessaging = False
   extendedMetadata = True
   eventAndDurationLoggingDuringPlayback = False
   accountLogging = False
   (Comment: This section is used to distinguish the features of the service.  
             The test suite uses the information in this section to determine
             which tests to run or skip based on the feature set of the 
             service.  The capabilities listed match those found in your Sonos
             Labs application.  Set to "True" those capabilities that you 
             enabled in your Sonos Labs application.  Leave those capabilities 
             you did not enable set to "False".  In the example above, the 
             service supports search, favoritesTracks, userContentPlaylists 
             (playlist editing) and extendedMetadata.  NOTE: extendedMetadata
             is required and must be set to "True", and favoritesArtists is not 
             supported and should remain "False".  This section is required for 
             all services.)

   [Additional Settings]
   HLSContent = False
   ephemeralTrackId = True
   ephemeralArtistId = False
   ephemeralAlbumId = False
   ephemeralStreamId = False
   ephemeralProgramId = False
   (Comment: This section is used to identify certain attributes of the service
             so that the selftest can run in an accommodating manner if necessary.
             The test suite uses the information in this section to determine if
             certain tests need to be run and to determine if test content should
             be updated dynamically as the tests run (sometimes IDs expire once
             playback is completed as a security measure).  The first flag,
             "HLSContent", designates whether or not HLS content is present on the
             service.  The remaining flags, which are prefixed with "ephemeral",
             identify which content types are ephemeral (frequently
             changing/expiring) and need to be updated dynamically throughout the
             tests. In the example above, the service has ephemeral track IDs but
             all other content types are constant/non-changing and HLS content is
             not present in the service.
             This section is required for all services.)

   [Favorite Containers]
   faveArtists =
   faveAlbums =
   faveTracks = user:favorite:tracks
   faveOther =
   (Comment: This section is used only for those services that support adding
             favorites using the Sonos controller. The self-test will attempt 
             to add and remove the supported content type using createItem and 
             deleteItem calls.  The test will then check the appropriate favorites
             container to verify the content was successfully added and removed.  
             In this section, add the ID of the container that will house the
             favorites content.  For instance, if all favorite tracks are added to
             the container with an ID of "user:favorite:tracks", put that
             ID in the faveTracks entry.  If a user can create a station from a
             streaming track, put the ID of the container the newly created
             station is placed in.  Leave those entries your service does not
             support blank.  In the example above, the service only supports
             favoriting tracks.  NOTE: faveArtists should remain blank as that 
             functionality is not supported on Sonos at this time.  This section 
             is required only for those services that support adding favorites 
             using the Sonos controller.)

   [Search Containers]
   searchArtist = service*search*artist
   searchAlbum = service*search*album
   searchTrack = service*search*track
   searchStream =
   searchProgram =
   searchPlaylist =
   searchOther =
   (Comment: This section is used only for those services that support search.  
             Enter the ids for your search categories.  These are the ids that
             are returned when running a getMetadata(search) call.  The self-test 
             will use these ids to run a series of search requests against
             each search category.  The searchOther entry is for any type of search 
             that is not covered by artist, album, track, stream, or program.  In 
             the example above, the service supports searching on artists, albums 
             and tracks.  This section is required for all services that support 
             search.)

   [Test Content]
   track = service_supported_track_id
   track title = corresponding_track_name
   album = service_supported_album_id
   album title = corresponding_album_name
   artist = service_supported_artist_id
   artist name = corresponding_artist_name
   stream =
   stream title =
   program = service_supported_program_id
   program title = corresponding_program_title
   other =
   other title =
   playlist = service_supported_playlist_id
   playlist title = corresponding_playlist_title
   (Comment: This section is used by the test to grab sample content
             without having to crawl your service to get applicable IDs.
             Each content type entry is an ID that maps back to content
             in your service and the corresponding title associated with
              that piece of content.  Please include IDs and titles for all
             the content types you support. The test will use this sample
             content for browse requests, search queries and validation,
             favorites functionality if applicable, verifying that metadata
             requests conform to Ratings requirements if applicable, and
             SMAPI Logging (event and duration logging during playback
             and/or accountLogging) if applicable. For those content types
             you do not support, leave the entry blank. If your service
              supports searching something other than artists, albums, tracks,
              streams or programs, please include that sample content under
              other.  The example above indicates the service supports tracks,
             artists, albums and program radio. This section is required
             for all services.)

   [Playlist]
   playlist folder =
   playlist share =
   (Comment: This section should be filled out if the service supports
             userContentPlaylists or subscribing to other users' playlists.
             If your service supports userContentPlaylists, the playlist
             folder entry should be the ID of the container that houses
             user-editable playlists.  Playlist share is used if your
             service supports subscribing to other users' playlists; if it
             does, add the ID of a subscribed-to playlist here.  This
             section is optional.)

   [ExtendedMetadata Text]
   textArtist = ARTIST_BIO
   textAlbum =
   textTrack =
   textProgram =
   textStream =
   textOther =
   (Comment: This section should be filled out only if the service 
             implementation returns a relatedText node in the 
              getExtendedMetadata response.  Enter the type that you
              have configured in your getExtendedMetadataResponse.  In the 
              example above, the service can return an ARTIST_BIO ExtendedMetadata 
              Text type.  This is required for those services that return a 
             relatedText node in the getExtendedMetadata response for any of
             their supported content.)
   
   [ExtendedMetadata Browse]
   browseArtist = RELATED_ARTISTS
   browseAlbum =
   browseTrack =
   browseProgram = RELATED_PROGRAMS
   browseStream =
   browseOther =
   (Comment: This section should be filled out only if the service implementation 
             returns a relatedBrowse node in the getExtendedMetadata response. 
              Enter the type that you have configured in your 
              getExtendedMetadataResponse.  In the example above, the service can 
              return RELATED_ARTISTS and RELATED_PROGRAMS ExtendedMetadata Browse 
              types.  This is required for those services that return a relatedBrowse 
             node in the getExtendedMetadata response for any of their supported 
             content.)
             
   [Polling Interval]
   interval = 60
   (Comment: This section should be filled out with the same polling interval 
             value in seconds you entered in your Sonos Labs application.  This 
              section is required for all services.  In the example above, the
              service has configured a 60-second polling interval.)

   [Validation Settings]
   strict = True
   (Comment: This option should be set to "True" to enforce WSDL
             compliance.  This means that all responses will be validated
             against the WSDL.)

How to Run the Test Suite
-------------------------

   The python script suite_selftest.py includes calls to all functional test
   modules provided in the package, with the exception of the Device Link test,
   which is explained in a later section.  The suite_selftest.py script is located
   in the smapi/content_workflow directory and should be run from there.  By
   default the suite will load the configuration defined by smapiConfig.cfg
   located in the smapi/service_configs directory.

   Example:

      python suite_selftest.py
         (default configuration file 'smapiConfig.cfg' will be used)

   Optional Parameter --config:

      Override the default configuration file with the --config parameter.
      
      python suite_selftest.py --config yourname.cfg


How to Run an Individual Test Module
------------------------------------

   Tests can be run, one module at a time by launching a specific python
   module from the content_workflow directory.

   Example:

      python search.py
         (default configuration file 'smapiConfig.cfg' will be used)


Running Device Link Test
------------------------

   This is a semi-automated test that is executed in several steps.
   It is only relevant to services which use Device Link authentication.

   Example:

      python devicelink_manual.py --config yourname.cfg

   Steps:

    1. Ensure you have enabled device link authentication in your configuration
       file by setting the oauth field to True under [Authentication Type].
       (Note: Make sure all other authentication types are set to False)
    2. Ensure you have filled in the householdId field under [Device Link].
    3. The device link code gets generated based on the householdId
       entered in the configuration file and is displayed to the tester in the
       console along with a registration URL.
    4. The tester will have ~8 minutes to navigate to the URL, log in and enter
       the registration code.  Meanwhile, the test will continue to poll every
       5 seconds for an authentication token (see the sketch after these steps).
       (Note: NotLinkedRetry faults are expected during this phase)
    5. Once the device link process has completed the key and token will be
       displayed to the tester in the console output.
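
   For reference, the polling phase described in step 4 amounts to a loop like
   the sketch below.  It is purely illustrative: devicelink_manual.py already
   does this for you, and send_device_auth_token is a hypothetical callable
   standing in for whatever SOAP client you use to issue getDeviceAuthToken.

      import time

      def poll_for_auth_token(send_device_auth_token, poll_seconds=5, timeout=8 * 60):
          """Poll for a device auth token until it is granted or the code expires.

          send_device_auth_token is any callable that issues a getDeviceAuthToken
          SOAP request and returns the (token, key) pair once the user has linked,
          or None while the service is still replying with NotLinkedRetry faults.
          """
          deadline = time.time() + timeout
          while time.time() < deadline:
              result = send_device_auth_token()
              if result is not None:
                  return result
              time.sleep(poll_seconds)
          raise RuntimeError('Device link code expired before it was used')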


Test Results
------------

   Test results are available in three files. When executing a single
   test module, a .txt and a -DEBUG.log file will be generated
   corresponding to the name of the test module run. When executing
   the entire test suite, a .txt and a -DEBUG.log file will be
   generated using the name of the config file used. In either case, a
   SummaryReport.html file will be generated.

   Example:

      Executing the test module search.py will generate:
         SummaryReport.html, SMAPI-Search-Validation.txt and
         SMAPI-Search-Validation-DEBUG.log

      Executing the suite will generate:
         SummaryReport.html, yourname.txt and yourname-DEBUG.log

   The SummaryReport.html and yourname.txt files contain identical
   data, formatted differently depending on usage. The
   SummaryReport.html should be submitted to Sonos when all tests
   pass.  If a failure is seen in one of these files, use the "TestID
   FixtureName TestCaseName" string to find the associated information
   in the DEBUG log. The DEBUG.log file contains information on every
   test case executed, as well as the XML request and response in a
   test. The XML for a test may be captured at any point of the test
   run, and not necessarily limited to the scope of the current test.
   To find the XML for a failing test, first find the failing test, and
   then read upward in the DEBUG.log until you find the appropriate XML.
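
   A quick way to do that "read up" by hand is a short script like the one
   below.  It is only a convenience sketch (the log layout is assumed, not
   documented): it prints the lines preceding the first occurrence of a
   failing test's "TestID FixtureName TestCaseName" string.

      import sys

      def context_before(log_path, needle, lines_of_context=40):
          """Print the lines leading up to the first line containing needle."""
          with open(log_path) as log_file:
              lines = log_file.readlines()
          for index, line in enumerate(lines):
              if needle in line:
                  for context_line in lines[max(0, index - lines_of_context):index + 1]:
                      sys.stdout.write(context_line)
                  return
          print 'String not found: %s' % needle

      # Example (hypothetical test identifier):
      # context_before('yourname-DEBUG.log', 'SMAPI-Search-Validation')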

   The live console output will display fewer details, indicating only
   the result of each test case and relevant XML responses.

   Each individual test case will be reported as one of the following:

      PASS:	No issues found in this test.
      FAIL:	The test encountered a response which was explicitly
            incorrect.
      SKIP:	The test was not executed because it was either not
            applicable to the service or the configuration file did
            not contain the information required by this particular
            test.
      WARN:	The test encountered some result which may indicate an
            issue with the service but did not explicitly fail the
            test.
      STOP:	The test encountered an issue which will not allow the
            remainder of the test case to execute.

   Only a test run which results in SKIP, PASS and WARN test outcomes
   will report an overall passing status. Because of this, any test
   case which results in a STOP will be counted as failing and will thus
   fail the entire module or suite.

   Once all test cases are run, a summary will display the results of
   the full test module or suite (depending on which was run).
   This summary will contain:

      - The overall status (Pass or Fail)
      - The number of failing test cases
      - The number of passing test cases

   If a test module encounters a critical error and is stopped before
   any test cases are executed, the output will simply read:

      No summary, test stopped


Crawler
-------

   The SMAPI Service Crawler dynamically grabs test content from your service and uses
   this content to fill in the [Test Content] fields in your configuration file.
   You can run the self-test suite and:
      * set the Crawler to overwrite each field with new content
      * set the Crawler to overwrite empty fields with new content
      * set the Crawler to do nothing
   By default, the SMAPI Service Crawler is off.

   For example:

      To run suite_selftest.py and set the Crawler to overwrite each field with new
      content, type:
           python suite_selftest.py -writeContent 1
      OR:
           python suite_selftest.py -w 1

      This will overwrite each field in the default configuration file,
      'smapiConfig.cfg', with content from your service.

      To run suite_selftest.py and set the Crawler to overwrite empty fields with new
      content, type:
           python suite_selftest.py -writeContent 0
      OR:
           python suite_selftest.py -w 0

      This will keep the content in existing fields of the default configuration
      file, 'smapiConfig.cfg', and add content from your service to the empty fields.

      To run suite_selftest.py without the Crawler, type:
           python suite_selftest.py
      OR:
           python suite_selftest.py -writeContent -1
      OR:
           python suite_selftest.py -w -1

      This will use the content from existing fields in the default configuration file,
      'smapiConfig.cfg', and no changes will be made.

   You can also run individual tests with the Crawler using the same -writeContent (-w)
   options.


Additional Resources
--------------------

For help reading log files to debug failed test cases and to reproduce
test failures using SOAP tools, please refer to:

http://musicpartners.sonos.com/?q=node/346
