Unified Testing Criteria for Java(TM) Technology-based Applications for Mobile Devices

Version 1.4

28 September 2004


DISCLAIMER. THIS DRAFT UNIFIED TESTING CRITERIA DOCUMENT ("DOCUMENT") IS FOR INFORMATIONAL PURPOSES ONLY. YOUR USE OF THIS DOCUMENT AND THE INFORMATION PROVIDED HEREIN IS AT YOUR OWN RISK. THE DOCUMENT IS PROVIDED ON AN "AS IS" AND "WITH ALL FAULTS" BASIS. MOTOROLA, INC. (BY AND THROUGH ITS "PERSONAL COMMUNICATIONS SECTOR"), NOKIA CORPORATION, SIEMENS AG, SONY ERICSSON MOBILE COMMUNICATIONS AB AND SUN MICROSYSTEMS, INC. (“INDUSTRY PARTNERS”) DISCLAIM ALL EXPRESS OR IMPLIED CONDITIONS, REPRESENTATIONS, AND WARRANTIES OF ANY KIND, INCLUDING ANY IMPLIED WARRANTY OR CONDITION OF MERCHANTABILITY, SATISFACTORY QUALITY, FITNESS FOR A PARTICULAR PURPOSE, OR NONINFRINGEMENT. THE INDUSTRY PARTNERS MAKE NO REPRESENTATIONS, WARRANTIES, CONDITIONS OR GUARANTEES AS TO THE USEFULNESS, QUALITY, SUITABILITY, TRUTH, ACCURACY OR COMPLETENESS OF THIS DOCUMENT. THE INDUSTRY PARTNERS MAY CHANGE THIS DOCUMENT AT ANY TIME WITHOUT NOTICE.


Table of Contents

1 Introduction
  1.1 Purpose
  1.2 Scope
  1.3 Definitions, Acronyms, and Abbreviations
  1.4 References
2 Retesting
3 Test Case Organization
  3.1 Organization
  3.2 Test Category Descriptions
  3.3 Pass/Fail Conditions
4 MIDP Tests
  4.1 Application Launch
  4.2 User Interface Requirements
    4.2.1 Clarity
    4.2.2 User Interaction
    4.2.3 Settings/Sound
  4.3 Functionality
  4.4 Operation
  4.5 Security
  4.6 Network
  4.7 Localization
5 JTWI (JSR 185) Tests
  5.1 MIDP 2.0 (JSR 118) Tests
    5.1.1 Security
    5.1.2 Operation
  5.2 WMA 1.1 (JSR 120) Tests
    5.2.1 Network
  5.3 MMAPI 1.1 (JSR 135) Tests
6 Bluetooth (JSR 82) Tests
  6.1 Network
7 Revision History


1 Introduction

1.1 Purpose

This document defines Unified Testing Criteria for Java(TM) Technology-based Applications ("Java Applications") as developed by the specification's sponsors: Motorola Inc., Nokia Corporation, Siemens AG mobile, Sony Ericsson Mobile Communications AB and Sun Microsystems Inc. ("UTI Members"). The sponsors have agreed on a process and a set of test criteria that are appropriate for applications developed for use on mobile handset devices manufactured by Motorola, Nokia, Siemens mobile and Sony Ericsson. The purpose of this document is to ensure that an application that has successfully passed the tests described by the criteria in this document, together with any applicable carrier-specific tests, is ready for deployment.

1.2 Scope

This document specifies testing requirements related to the operating characteristics of a Java application that runs on mobile devices such as handsets. The tests are organized by JSR and, within a JSR, by category. The categories of tests include application launch, usability, operation, etc.

The testing is performed within a larger context: some amount of pretesting will have occurred and some amount of posttesting will also occur. Neither of these activities has any bearing on the tests described within this document (except where pretesting determines that certain JSR requirements are being violated, in which case the application must be corrected before being submitted for testing). This document does, however, acknowledge the presence and the importance of pretesting and posttesting mobile applications.

The document does not address the following:



• Content censorship (i.e. assessment against standards for violence, gambling, political messaging, etc.) for the purpose of preventing the deployment or sale of an application.

• Distribution, DRM, etc.

• Testing requirements specific to a particular manufacturer's (or network operator's) device, user interface, and standards (e.g. WAP) implementation.

1.3 Definitions, Acronyms, and Abbreviations

All trade marks are acknowledged.

Acronym   Definition
API       Application Program Interface
DRM       Digital Rights Management
J2ME™     Java™ 2 Platform, Micro Edition
MIDP      Mobile Information Device Profile
OTA       Over The Air
WAP       Wireless Application Protocol

1.4 References



• Motorola J2ME Generic Test Guide, Version 1.0
• Motorola A830 Certification Developer Guide, Version 1.05
• Nokia OK MIDP Application Guidelines for Games, Version 1.1
• Nokia OK MIDP Application Requirements, 19-09-2002
• Developer Check List for J2ME Applications [Forum Nokia]
• Siemens mobile Optimized Test for J2ME, Version 1.0
• Sun Mobile Certification Test Criteria


2 Retesting

Certain conditions require that an application be tested more than once. For example, if the firmware in the device were upgraded to add functionality and the application needed to run on both versions, a retest would be required. Not all tests in this document need to be applied to an application that is being retested, only those most likely to be affected by the change. In the rest of this document, tests that would be part of a retest are marked with "(R)". For example:

Test Identifier: AL1 (R)
Summary: The application must install via OTA.

All other tests need not be re-applied.


3 Test Case Organization

3.1 Organization

The manual test cases, which follow this section, are organized by JSR. In some cases a JSR will consist of other JSRs; for example, JSR 185 (JTWI) consists of three other JSRs. Within a group of JSR-related tests, the tests are further organized by category. The definitions of the test categories are shown below. The current test organization by JSR is:

• MIDP – Common tests for MIDP 1.0 (JSR 37) and MIDP 2.0 (JSR 118)
• JTWI (JSR 185)
  • MIDP 2.0 (JSR 118) – Tests specific to MIDP 2.0
  • WMA 1.1 (JSR 120)
  • MMAPI 1.1 (JSR 135)
• Java APIs for Bluetooth (JSR 82)

A note on test identification: with the exception of the MIDP section, the identifier for an individual test is constructed as <Category>-<JSR #>-nn, where Category is one of SE, AL, LO, FN, etc. (see section 3.2), JSR # is the JSR number to which the test applies, and nn is a sequential number starting at 01. For example: SE-120-01.

3.2 Test Category Descriptions

Application Launch (AL) – Once an application is loaded, it must start (launch) and stop correctly in relation to the device and other applications on the device.

User Interface (UI) – The intent is not to specify exactly how to design a user interface but rather to give general guidelines. It is expected that publishers and network operators will further define the look and feel of an application's user interface to bring it into conformance with their overall look and feel.

Functionality (FN) – Documented features are implemented in the application and work as expected. Sources for this information are user manuals, formatted application specification documents and online documentation.


Operation (OP) – Once an application has been started, these tests determine how well the application interacts with other elements of the device, both hardware- and software-based.

Security (SE) – Applications that are network-enabled are tested for secure transmissions and the handling of network failures.

Network (NT) – If an application is network-enabled, it must demonstrate its ability to communicate over a network correctly. It must be capable of dealing with both network problems and server-side problems.

Localization (LO) – Applications that are to be deployed to localities other than their point of origin must account for changes in language, alphabets, date and money formats, etc.

3.3 Pass/Fail Conditions

An application must pass every test in each applicable test category to be successful. Each test carries equal weight, so no scoring system is needed.


4 MIDP Tests

The tests in this section are for applications that use either version 1.0 (JSR 37) or version 2.0 (JSR 118) of the Mobile Information Device Profile (MIDP). That is, they represent a common set of tests that can be applied to applications using either version of the MIDP. The tests are organized by test category; a description of the categories can be found in section 3.2.

4.1 Application Launch

Test Identifier: AL1 (R)
Summary: The application must install via OTA.

Test Identifier: AL2 (R)
Summary: The application must launch and exit properly.
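As an illustration of AL2, the following is a minimal sketch of a MIDlet that launches and exits cleanly. The class name and screen content are illustrative and not taken from this document; the key point is that the Exit command calls destroyApp() and notifyDestroyed() so the application management software knows the MIDlet terminated normally.

    import javax.microedition.lcdui.Command;
    import javax.microedition.lcdui.CommandListener;
    import javax.microedition.lcdui.Display;
    import javax.microedition.lcdui.Displayable;
    import javax.microedition.lcdui.Form;
    import javax.microedition.midlet.MIDlet;
    import javax.microedition.midlet.MIDletStateChangeException;

    // Minimal sketch of a MIDlet that launches and exits cleanly (AL2).
    public class ExampleMIDlet extends MIDlet implements CommandListener {
        private final Command exitCommand = new Command("Exit", Command.EXIT, 1);
        private Form mainForm;

        protected void startApp() throws MIDletStateChangeException {
            if (mainForm == null) {
                mainForm = new Form("Example");
                mainForm.append("Application started.");
                mainForm.addCommand(exitCommand);
                mainForm.setCommandListener(this);
            }
            Display.getDisplay(this).setCurrent(mainForm);
        }

        protected void pauseApp() {
            // Release shared resources here if the device pauses the MIDlet.
        }

        protected void destroyApp(boolean unconditional) {
            // Free resources (connections, record stores) before exit.
        }

        public void commandAction(Command c, Displayable d) {
            if (c == exitCommand) {
                destroyApp(true);
                notifyDestroyed(); // tell the AMS the MIDlet has exited cleanly
            }
        }
    }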

Test Identifier: AL3 (R)
Summary: The application must launch within 15 seconds. Consideration will be given to applications that are subject to Digital Rights Management or other types of verification.

4.2 User Interface Requirements

4.2.1 Clarity

Test Identifier: UI1 (R)
Summary: All screen content must be clear (e.g. the screen must not be crowded with content) and readable to the naked eye, regardless of the information displayed or the choice of font, colour scheme, etc.

Test Identifier: UI2 (R)
Summary: Each screen must appear for the time necessary to read all of its information.


Test Identifier: UI3
Summary: If text is truncated, it must remain understandable to the target user group. For example, parts of letters must not be cut horizontally; items in lists may be shortened as long as the different options remain understandable.

4.2.2 User Interaction

Test Identifier: UI4
Summary: The main functionalities of Exit, About and Help must be easily accessible through a Main Menu.

Test Identifier: UI5
Summary: The application's user interface should be consistent throughout, e.g. common series of actions, action sequences, terms, layouts, soft button definitions, vibra and sounds.

Test Identifier: UI6
Summary: Where the application uses menu or selection items, the function of each selection and menu item must be clearly understandable to the user. Further, each menu or selection item must perform a valid action (i.e. no menu orphans).

Test Identifier: UI7
Summary: Sequences of actions (e.g. submitting a form) must be organised into groups with a beginning, a middle and an end. Informative feedback must be provided on the completion of a group of actions.

Test Identifier: UI8 (R)
Summary: The speed of the application must be adequate and must not compromise the application's use. The performance of the application must be acceptable.

Test Identifier: UI9
Summary: Any error messages in the application must be clearly understandable. Error messages must clearly explain the nature of the problem to the user and indicate what action needs to be taken (where appropriate).

Test Identifier: UI10
Summary: The user should be able to pause (e.g. for games) and resume the application easily.

Test Identifier: UI11
Summary: Easy reversal of actions must be offered, or the user must be informed that an action is irreversible (e.g. overwriting an entry in a database).

Test Identifier: UI12
Summary: The number of screens a user has to browse through must be minimal.

Test Identifier: UI13 (R)
Summary: Any selection of a different function in the application should be performed within 5 seconds. Within 1 second there must be some visual indication that the function will be performed. The visual indication can be prompting for user input, using splash screens or progress bars, displaying text such as "Please wait...", etc.
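One common way to satisfy UI13 is to move the slow work onto a background thread and show a wait screen immediately. The sketch below assumes MIDP LCDUI; the helper class and its names are hypothetical and not part of this specification.

    import javax.microedition.lcdui.Display;
    import javax.microedition.lcdui.Displayable;
    import javax.microedition.lcdui.Form;
    import javax.microedition.midlet.MIDlet;

    // Sketch: give visual feedback within one second (UI13), then switch to the
    // result screen when the long-running task finishes.
    public class WaitScreenHelper {

        public static void runWithWaitScreen(final MIDlet midlet, final Runnable slowTask,
                                             final Displayable resultScreen) {
            Form waitForm = new Form("Please wait...");
            waitForm.append("Loading, please wait...");
            Display.getDisplay(midlet).setCurrent(waitForm); // immediate visual feedback

            new Thread() {
                public void run() {
                    slowTask.run();                                      // the slow operation
                    Display.getDisplay(midlet).setCurrent(resultScreen); // switch when done
                }
            }.start();
        }
    }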


4.2.3 Settings/Sound

Test Identifier: UI14
Summary: The application must provide the user with a mute/off setting for background music and/or sound effects.

Test Identifier: UI15
Summary: All sounds should have a specific function and should not be overused (e.g. a game ending with a minute of random noise is not permitted).

Test Identifier: UI16 (R)
Summary: The current status of the settings must not affect the use of the application (e.g. whether or not the sound is on in a game). For example, switching off the sound must not change the execution of the game.

Test Identifier: UI17
Summary: The current status of each setting must be clear: the application should make it visible, for example with check boxes or by changing text.

Test Identifier: UI18
Summary: Each setting must have its own enable/disable control (e.g. Vibra and Sound). Settings must not be combined into a single control (e.g. a single "Vibra and Sound" option).

Test Identifier: UI19
Summary: When an application exits, all settings must be saved. Restarting the application must restore these saved settings.
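UI19 implies some persistent storage on the device. A minimal sketch using the MIDP Record Management System (RMS) is shown below; the record store name and the single-byte settings layout are illustrative assumptions, not part of this specification.

    import javax.microedition.rms.RecordStore;
    import javax.microedition.rms.RecordStoreException;

    // Sketch: persist settings so they survive an exit/restart (UI19).
    public class SettingsStore {
        private static final String STORE_NAME = "settings";

        // Saves a single settings byte (e.g. bit 0 = sound on, bit 1 = vibra on).
        public static void save(byte settings) throws RecordStoreException {
            RecordStore rs = RecordStore.openRecordStore(STORE_NAME, true);
            try {
                byte[] data = new byte[] { settings };
                if (rs.getNumRecords() == 0) {
                    rs.addRecord(data, 0, data.length);
                } else {
                    rs.setRecord(1, data, 0, data.length); // record IDs start at 1
                }
            } finally {
                rs.closeRecordStore();
            }
        }

        // Loads the saved settings, or returns the given default on first run.
        public static byte load(byte defaultSettings) throws RecordStoreException {
            RecordStore rs = RecordStore.openRecordStore(STORE_NAME, true);
            try {
                if (rs.getNumRecords() == 0) {
                    return defaultSettings;
                }
                return rs.getRecord(1)[0];
            } finally {
                rs.closeRecordStore();
            }
        }
    }

Calling save() from destroyApp() and load() from startApp() is one simple way to meet the save-on-exit, restore-on-restart requirement.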

4.3 Functionality

Test Identifier: FN0
Summary: The application must do what it is meant to do (as specified in the Help file).

Test Identifier: FN1 (R)
Summary: An exit function must be explicitly present in the application (e.g. in a Main Menu).

Test Identifier: FN2
Summary: An additional menu item for About is recommended. If present, the About section's data must be consistent with the information contained in the JAD. The JAD file, and the About section if present, must include the vendor name, the MIDlet name and the version number of the MIDlet.
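One way to keep the About data consistent with the JAD (FN2) is to read the descriptor attributes through MIDlet.getAppProperty() instead of hard-coding them. A sketch, assuming MIDP LCDUI; the class name is illustrative.

    import javax.microedition.lcdui.Alert;
    import javax.microedition.lcdui.AlertType;
    import javax.microedition.lcdui.Display;
    import javax.microedition.midlet.MIDlet;

    // Sketch: build the About screen from the JAD/manifest attributes so the
    // displayed data cannot drift out of sync with the descriptor (FN2).
    public class AboutScreen {

        public static void show(MIDlet midlet) {
            String name    = midlet.getAppProperty("MIDlet-Name");
            String vendor  = midlet.getAppProperty("MIDlet-Vendor");
            String version = midlet.getAppProperty("MIDlet-Version");

            Alert about = new Alert("About",
                    name + " " + version + "\n" + vendor,
                    null, AlertType.INFO);
            about.setTimeout(Alert.FOREVER);
            Display.getDisplay(midlet).setCurrent(about);
        }
    }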

Test Identifier: FN3
Summary: Help must be provided in the application. It should include the aims of the application and the use of keys (e.g. for games). If the help text is too long, it should be divided into smaller sections and/or organised differently.

Test Identifier: FN4 (R)
Summary: The application must not cause, or attempt to cause, harm to system applications or data stored in the terminal.

4.4 Operation

Test Identifier: OP1 (R)
Summary: If the application is interrupted by incoming events such as a voice call, a text message, or the posting of an error message, the application should resume gracefully once the interrupt is over. This should be true whether the application pauses or continues to run during the interrupt. The developer is encouraged to use the available API pause and continue methods.
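For OP1, the MIDP lifecycle methods pauseApp() and startApp() are the natural hooks for handling interrupts. A minimal sketch follows; the game-specific helper methods are illustrative stubs, not part of this document.

    import javax.microedition.midlet.MIDlet;

    // Sketch: resume gracefully after an interruption such as a voice call (OP1).
    // The system calls pauseApp() when the interrupt occurs and startApp() again
    // when the application comes back to the foreground.
    public class InterruptibleMIDlet extends MIDlet {
        private boolean wasInterrupted = false;

        protected void startApp() {
            if (wasInterrupted) {
                wasInterrupted = false;
                resumeGame();   // restore the state saved in pauseApp()
            } else {
                startGame();
            }
        }

        protected void pauseApp() {
            wasInterrupted = true;
            pauseGame();        // stop sounds and timers, save transient state
        }

        protected void destroyApp(boolean unconditional) { }

        private void startGame()  { /* illustrative stub */ }
        private void pauseGame()  { /* illustrative stub */ }
        private void resumeGame() { /* illustrative stub */ }
    }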

4.5 Security

Test Identifier: SE1 (R)
Summary: Use encryption (if the system supports it) to send passwords or other sensitive personal data.
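For SE1, a device can be asked for an encrypted channel simply by opening an https:// URL through the Generic Connection Framework. This is a sketch only, assuming the device supports HTTPS (available from MIDP 2.0); the URL and parameter names are illustrative.

    import java.io.IOException;
    import java.io.OutputStream;
    import javax.microedition.io.Connector;
    import javax.microedition.io.HttpConnection;

    // Sketch: send sensitive data over a TLS/SSL-protected connection (SE1).
    public class SecureLogin {

        public static int sendPassword(String user, String password) throws IOException {
            HttpConnection conn = null;
            OutputStream out = null;
            try {
                // The https:// scheme asks the platform for an encrypted connection.
                conn = (HttpConnection) Connector.open("https://example.com/login");
                conn.setRequestMethod(HttpConnection.POST);
                conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

                byte[] body = ("user=" + user + "&password=" + password).getBytes();
                out = conn.openOutputStream();
                out.write(body);

                return conn.getResponseCode(); // e.g. HttpConnection.HTTP_OK
            } finally {
                if (out != null)  { out.close(); }
                if (conn != null) { conn.close(); }
            }
        }
    }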

Test Identifier: SE2
Summary: Sensitive data such as credit card details must not be stored locally by the application.

Test Identifier: SE3
Summary: The application must not echo the input of sensitive data, e.g. PINs and passwords. However, the chosen character may appear briefly in order to confirm what character has been entered; the character must then be masked.
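For SE3, the platform's own masking behaviour can be used by creating the input field with the PASSWORD constraint. A small sketch; the labels and field sizes are illustrative.

    import javax.microedition.lcdui.Form;
    import javax.microedition.lcdui.TextField;

    // Sketch: let the platform mask sensitive input (SE3). With the PASSWORD
    // constraint, implementations typically show each character only briefly
    // before replacing it with a mask character.
    public class LoginForm {
        public static Form create() {
            Form form = new Form("Login");
            // NUMERIC | PASSWORD: digits only, echoed as mask characters.
            TextField pin = new TextField("PIN", "", 8,
                    TextField.NUMERIC | TextField.PASSWORD);
            form.append(pin);
            return form;
        }
    }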


4.6 Network

Test Identifier: NT2 (R)
Summary: If the application is network-enabled, appropriate error messages must be displayed when the application attempts to send or receive data while data services are not available. The application should respond gracefully to the loss of network connectivity.

Test Identifier: NT3 (R)
Summary: The application should be able to handle delays, for instance when making a connection it may need to wait for permission from the user.

Test Identifier: NT4 (R)
Summary: The application must be able to handle situations where a connection is not allowed.

Test Identifier: NT5 (R)
Summary: The application must close the connection it is using once the session is over.
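The following is a defensive connection helper illustrating NT2, NT4 and NT5 above: failures are reported rather than swallowed, a refused connection is handled, and the connection is always closed. This is a sketch only; the error-reporting mechanism (a StringBuffer the caller displays to the user) and the use of HTTP are illustrative assumptions.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import javax.microedition.io.Connector;
    import javax.microedition.io.HttpConnection;

    // Sketch: fetch data while handling network loss (NT2), denied connections
    // (NT4) and connection cleanup (NT5).
    public class NetworkHelper {

        // Returns the response body, or null after recording an error message.
        public static byte[] fetch(String url, StringBuffer errorMessage) {
            HttpConnection conn = null;
            InputStream in = null;
            try {
                conn = (HttpConnection) Connector.open(url);
                int code = conn.getResponseCode();
                if (code != HttpConnection.HTTP_OK) {
                    errorMessage.append("Server returned error ").append(code);
                    return null;
                }
                in = conn.openInputStream();
                ByteArrayOutputStream body = new ByteArrayOutputStream();
                byte[] buf = new byte[256];
                int n;
                while ((n = in.read(buf)) != -1) {
                    body.write(buf, 0, n);
                }
                return body.toByteArray();
            } catch (IOException e) {
                // NT2: data service unavailable, connection refused, timeout, etc.
                errorMessage.append("Network unavailable: ").append(e.getMessage());
                return null;
            } catch (SecurityException e) {
                // NT4: the user (or the security policy) did not allow the connection.
                errorMessage.append("Connection not permitted");
                return null;
            } finally {
                // NT5: close the connection when the session is over.
                try {
                    if (in != null)   { in.close(); }
                    if (conn != null) { conn.close(); }
                } catch (IOException ignored) { }
            }
        }
    }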

4.7 Localization

Test Identifier: LO1
Summary: Data formats must be handled appropriately for the targeted country. The test will verify that all dates, times, time zones, week start days, numeric separators and currencies are formatted appropriately for the implemented language's target country and are supported throughout the application.

Test Identifier: LO2
Summary: Data entry fields must accept and properly display international characters. The test will verify that all data entry fields accept and properly display the characters that are used in the target country, for example 'ä', 'ö' and 'å' in Scandinavian countries.
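For LO1 above, the platform locale can be read from the microedition.locale system property and used to choose formats. The sketch below covers only two illustrative date layouts and is not a complete localization scheme; a real application would handle every locale it ships to.

    import java.util.Calendar;
    import java.util.Date;

    // Sketch: pick a date layout based on the device locale (LO1).
    public class DateFormatter {

        public static String formatDate(Date date) {
            Calendar cal = Calendar.getInstance();
            cal.setTime(date);
            int day   = cal.get(Calendar.DAY_OF_MONTH);
            int month = cal.get(Calendar.MONTH) + 1; // Calendar.MONTH is zero-based
            int year  = cal.get(Calendar.YEAR);

            String locale = System.getProperty("microedition.locale"); // e.g. "en-US", "de-DE"
            if (locale != null && locale.startsWith("en-US")) {
                return month + "/" + day + "/" + year;   // US order: month first
            }
            return day + "." + month + "." + year;       // common European order
        }
    }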


5 JTWI (JSR 185) Tests

JSR 185, known as Java Technology for the Wireless Industry (JTWI), consists of two required specifications:

• MIDP 2.0 (JSR 118) – Mobile Information Device Profile
• WMA 1.1 (JSR 120) – Wireless Messaging API

and one optional specification:

• MMAPI 1.1 (JSR 135) – Mobile Media API

The tests in this section are organised by the JSRs above. Within a JSR section, the tests appear within a category as defined in section 3.2.

5.1 MIDP 2.0 (JSR 118) Tests

The MIDP 2.0 release, with its new and enhanced capabilities, requires additional tests to be specified. Below are brief descriptions of these capabilities, organized by the UTI's current testing categories. Not all categories are present.

5.1.1 Security

A significant addition to the MID Profile is security. The security model has expanded from a strict sandbox model to a highly configurable set of permissions for accessing certain method calls. Permissions are organized into protection domains, and many of the tests center around the proper behavior of these domains. There are four domains:

1. Manufacturer
2. Operator
3. Trusted Third Party
4. Untrusted

A MIDlet that has not been signed, or whose signature cannot be verified and authenticated against any of the Manufacturer, Operator or Trusted Third Party certificates in the device, is considered Untrusted. For explanations and a deeper understanding of the new security model, please read the MIDP 2.0 specification.

The security tests below are derived from Chapter 15, The Recommended Security Policy for GSM/UMTS Compliant Devices, in the MIDP 2.0 specification. The application under test is monitored, and if the following behavior is noticed it will be reported in the test results.


Test Identifier: SE-118-01 (R)
Summary: A MIDlet must not be able to override security prompts and notifications to the user generated by the system or the virtual machine.

Test Identifier: SE-118-02 (R)
Summary: A MIDlet must not be able to simulate security warnings to mislead the user.

Test Identifier: SE-118-03 (R)
Summary: A MIDlet must not be able to simulate key-press events to mislead the user.

5.1.2 Operation

New features introduced in MIDP 2.0, such as the Push Registry, require additional testing in the Operation category.

Test Identifier: OP-118-01 (R)
Summary: All registered alarms and connections must be activatable under test.

Test Identifier: OP-118-02 (R)
Summary: All push-activated MIDlets must show some visual indication to the user that push activation has occurred.
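OP-118-01 and OP-118-02 concern the MIDP 2.0 Push Registry. Below is a sketch of dynamic registration and of checking for pending push connections at start-up; the port, the MIDlet class name and the sender filter are illustrative assumptions.

    import java.io.IOException;
    import javax.microedition.io.PushRegistry;

    // Sketch: register an inbound connection so the MIDlet can be push-activated
    // (OP-118-01) and detect push activation at start-up (OP-118-02).
    public class PushSetup {

        public static void registerSmsPort() throws IOException, ClassNotFoundException {
            // Wake the named MIDlet when an SMS arrives on port 5000 from any sender.
            // The class must belong to this MIDlet suite.
            PushRegistry.registerConnection("sms://:5000",
                                            "com.example.PushMIDlet",
                                            "*");
        }

        // Called from startApp(): connections with pending input tell the MIDlet
        // it was push-activated, so it can show a visual indication to the user.
        public static String[] pendingConnections() {
            return PushRegistry.listConnections(true);
        }
    }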

5.2 WMA 1.1 (JSR 120) Tests

JSR 120 is the specification for the Wireless Messaging API.

5.2.1 Network

Test Identifier: NT-120-01
Summary: Verify that the MIDlet successfully sends the SMS message. This test verifies that the WMA method is used correctly.
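The following is a sketch of the send path that NT-120-01 exercises, using the WMA 1.1 MessageConnection API; the destination address and message text are illustrative.

    import java.io.IOException;
    import javax.microedition.io.Connector;
    import javax.wireless.messaging.MessageConnection;
    import javax.wireless.messaging.TextMessage;

    // Sketch: send a text SMS through WMA 1.1 (NT-120-01).
    public class SmsSender {

        public static void send(String phoneNumber, String text) throws IOException {
            MessageConnection conn = null;
            try {
                conn = (MessageConnection) Connector.open("sms://" + phoneNumber);
                TextMessage msg =
                    (TextMessage) conn.newMessage(MessageConnection.TEXT_MESSAGE);
                msg.setPayloadText(text);
                conn.send(msg); // blocking; call from a background thread, not the UI thread
            } finally {
                if (conn != null) { conn.close(); }
            }
        }
    }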

Test Identifier: NT-120-02
Summary: Verify that the message is formatted appropriately if it is being sent to the handset mailbox.

Test Identifier: NT-120-03
Summary: Verify that the SMS or CB message is received and processed correctly by the appropriate receiving application if the message is being sent to an application.

Test Identifier: NT-120-04
Summary: Verify that the application presents an accurate and appropriate error message to the user if the handset is unable to send the SMS message due to external factors (e.g. lack of network connectivity).

5.3 MMAPI 1.1 (JSR 135) Tests

To be developed by the Members.


6 Bluetooth (JSR 82) Tests

JSR 82 defines Java APIs for Bluetooth-enabled J2ME devices.

6.1 Network

Test Identifier: NT-082-01
Summary: If Bluetooth can be turned on and off, do so. How does the application behave when these events occur? Is the user notified?

Test Identifier: NT-082-02
Summary: When the application closes, it must close any Bluetooth connection that it established.
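Below is a sketch touching both NT-082-01 and NT-082-02: detecting whether the Bluetooth stack is usable and releasing the connection on exit. The service URL is illustrative; a real client would obtain it through device and service discovery.

    import java.io.IOException;
    import javax.bluetooth.LocalDevice;
    import javax.microedition.io.Connector;
    import javax.microedition.io.StreamConnection;

    // Sketch: handle Bluetooth being off (NT-082-01) and close the connection
    // when the application exits (NT-082-02).
    public class BluetoothClient {
        private StreamConnection connection;

        public boolean open(String serviceUrl) {
            try {
                // Throws BluetoothStateException (an IOException) if Bluetooth
                // is switched off or otherwise unusable.
                LocalDevice.getLocalDevice();
                connection = (StreamConnection) Connector.open(serviceUrl); // e.g. "btspp://..."
                return true;
            } catch (IOException e) {
                // NT-082-01: tell the user that Bluetooth is off or the peer is unreachable.
                return false;
            }
        }

        // NT-082-02: call this from destroyApp() so the link is released on exit.
        public void close() {
            if (connection != null) {
                try { connection.close(); } catch (IOException ignored) { }
                connection = null;
            }
        }
    }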


7 Revision History

Version  Date                Name  Reason
V1.0     24 November 2003    All   Version 1.0
V1.1     5 February 2004     All   Modifications made to testing workflow.
V1.2     25 February 2004    All   Removed Test Process chapter; reformatted document.
V1.3     20 May 2004         All   Modified: UI4, UI5, UI10, FN2, SE1. Added: FN0. Deleted: NT1. Moved: UI-118-1,2,3 to SE-118-1,2,3.
V1.4     28 September 2004         Modified: UI19, LO1, LO2, UI3 and FN2.

