13-05-2014 – D438013136- 1 / 44 For internal Use

LTE Cluster Acceptance Guidelines FDD/TDD Release Independent

☐ Regulated document   ☐ Unregulated document

Version, status: 0.1, Draft/Approved
Date: 13-05-2014
Creator: Lorena Serna
Owner: Lorena Serna
Function: NPO Guidelines
Approver:
Doc ID: D438013136
Doc Location: https://sharenet-ims.inside.nokiasiemensnetworks.com/Overview/D438013136

Change History

Version | Status   | Date       | Handled by   | Comments
0.0     | Draft    | 25-08-2009 | Lorena Serna | Initial version
0.1     | Approved | 18-09-2009 | Lorena Serna | Coverage threshold section
1.0     | Approved | 25-06-2010 | Lorena Serna | General update
1.1     | Approved | 29-12-2010 | Lorena Serna | General update
1.2     | Approved | 30-06-2011 | Lorena Serna | General update
1.3     | Approved | 12-07-2011 | Gerson Mann  | General update
1.4     | Approved | 30-11-2011 | Gerson Mann  | General update
2.0     | Released | 11-12-2013 | Lorena Serna | Overall update. Removal of outdated sections. Addition of new sections. NSN rebranding. Addition of test cases and reports from projects.
2.1     | Released | 14-05-2014 | Lorena Serna | Nokia rebranding. Corrections in Section 4.4.6.

© Nokia 2014


This material, including documentation and any related computer programs, is protected by copyright controlled by NOKIA. All rights are reserved. Copying, including reproducing, storing, adapting or translating, any or all of this material requires the prior written consent of NOKIA. This material also contains confidential information, which may not be disclosed to others without the prior written consent of NOKIA.

company.nokia.com


Contents

1     Scope of the Document ............................................. 5
2     Cluster Acceptance Process ........................................ 6
2.1   Overview .......................................................... 6
2.2   Preparation Phase ................................................. 7
2.2.1 Project Setup Phase ............................................... 7
2.2.2 Acceptance Setup Phase ............................................ 8
2.3   Measurement Phase ................................................. 8
2.4   Optimization Phase ................................................ 8
2.5   Reporting Phase ................................................... 8
2.6   Handling of deviations ............................................ 9
2.7   Exit Criteria ..................................................... 9
3     Preparation Phase ................................................. 11
3.1   Pre-requisites .................................................... 11
3.2   Network Assessment ................................................ 11
3.2.1 Basic Checks ...................................................... 12
3.3   Cluster Definition ................................................ 13
3.3.1 Cluster Border and Measurement Validity ........................... 13
3.4   Drive Route Definition ............................................ 13
3.5   Coverage Thresholds ............................................... 15
4     Measurement Phase ................................................. 18
4.1   FMT: Test Equipment ............................................... 18
4.1.1 Scanners .......................................................... 19
4.1.2 Drive Test Tools and Terminals .................................... 20
4.2   Service Testing and FMT ........................................... 21
4.3   Testing Methods ................................................... 22
4.4   Field Test Cases for Acceptance ................................... 25
4.4.1 FTP Upload/Download ............................................... 25
4.4.2 User Plane Latency (RTT) .......................................... 27
4.4.3 Mobility Test cases ............................................... 28
4.4.4 CSFB (CS Fallback) ................................................ 29
4.4.5 Link Budget Verification .......................................... 30
4.4.6 Drive Test configurations for Acceptance .......................... 31
4.5   Single Site Acceptance ............................................ 33
5     Optimization Phase ................................................ 34
5.1   Post-processing tools ............................................. 34
5.2   KPIs .............................................................. 35
5.2.1 Coverage KPIs ..................................................... 35
5.2.2 Service Level KPIs ................................................ 36
5.2.3 OSS KPIs .......................................................... 36
5.3   Data Post-processing Considerations ............................... 37
5.3.1 Accessibility ..................................................... 37
5.3.2 Throughput ........................................................ 38
5.3.3 Reliability ....................................................... 38
5.3.4 Round Trip Time ................................................... 38
5.4   Corrective actions ................................................ 39
6     Reporting ......................................................... 40
7     References ........................................................ 42
8     Glossary .......................................................... 42

1 Scope of the Document

The purpose of this guideline is to provide NPO radio network planners with a document that describes the different steps of the cluster acceptance process, the tasks involved and guidance on how to perform those tasks. This document is a guideline and, as such, the processes described are generic. The detailed implementation will vary from project to project depending on the scope and the share of responsibilities between NOKIA and the customer.

Section 2 describes the overall acceptance process with its main phases. The acceptance process is common to previous technologies (i.e. 2G, 3G). Section 3 covers the Preparation Phase, a mix of common process topics (drive route and cluster definition) and technology-specific ones (basic checks and coverage thresholds). Section 4 covers the Measurement Phase, from information about measurement tools to guidance on acceptance tests. Section 5 provides information on post-processing tools and the KPIs required during the Optimization Phase. Section 6 is dedicated to Reporting, with a collection of report examples from live projects. The contents of Sections 3 to 6 are LTE specific.

During trials or early projects the scope of the acceptance is wider, e.g. cases to be tested in the lab, in the field, under different loads and radio conditions, etc. The scope of this document is the pre-launch cluster acceptance of a commercial network. The cases for the acceptance are limited to field cases (mainly based on drive testing); they are done under no load unless otherwise agreed with the customer, and they focus on the performance of the network from the end-user point of view rather than on the verification of features or functionalities such as scheduler, admission control, power control, etc. Test cases to verify features/functionalities are available per release in the ATMN [4]. It is not within the scope of this document to describe the tools used during the acceptance process (e.g. FMT tools and post-processing tools), although links to additional sources of information are provided in Section 7.

This document will be updated at regular intervals to ensure that it reflects the latest information. To that end, feedback from NOKIA NPO projects is encouraged. Please provide feedback to [email protected]. Additional sections can be requested at any time and will be included as and when appropriate.


2 Cluster Acceptance Process

2.1 Overview

This document describes the cluster acceptance process from the radio point of view. The considered scenario is an LTE network with operational sites that has not yet been commercially launched. The overall process is no different from the cluster acceptance process in other mobile networks (e.g. 3G). The purpose of the cluster acceptance is to perform RF design verification, to identify and categorize network, coverage and quality problems, and to optimize for the best cluster performance according to the acceptance performance criteria, while the rollout continues in other parts of the network. During the acceptance process the performance of the network is tested against the acceptance criteria. These consist of a set of KPIs that have previously been agreed with the customer. The acceptance criteria will be verified by means of field drive test measurements.

The cluster/network acceptance is the step between the planning and the 'commercial' launch of a network. It is usually performed at two different levels: cluster and network level. A cluster covers a smaller geographical region. The number of sites per cluster will vary according to different conditions, such as the geographical location of the sites. The main idea behind the cluster concept is manageability. It is also more efficient to focus the acceptance procedure on smaller regions before passing to the network-wide acceptance process. Once the acceptance has been carried out cluster by cluster, it is possible to pass to the second level, that is, the network acceptance, where the network acceptance tests are completed after verifying the network status and doing some optimization if necessary.

Similarly to the cluster acceptance, the purpose of the network acceptance tests is to perform a complete RF design verification, to identify and categorize network, coverage, capacity and quality problems, and to optimize for the best network performance according to the acceptance performance criteria set in the tender agreement. Depending on the size of the network, the acceptance can be divided into smaller areas in order to facilitate the tests. In this case the network acceptance is passed when all the area acceptance tests are passed.


Figure 1: Cluster acceptance vs. Network Acceptance

Figure 1 describes, at a high level, the Cluster and Network Acceptance Process. This process can be divided into four main phases:

  • Preparation Phase
  • Measurement Phase
  • Optimization Phase
  • Reporting Phase

2.2 Preparation Phase

The preparation phase can be divided into two sub-phases:

2.2.1 Project Setup Phase

At this point the network planning work has been completed, and the LTE sites have been commissioned, integrated and are fully functional. The main steps of this phase are:

  • Define/identify the different clusters into which the network will be subdivided for the pre-launch optimization tasks.
  • Define the drive routes that will be used to check the performance of the network.
  • Prior to the measurement phase, check that the sites are on air and that the data-build of each is consistent with what was planned and agreed with the customer. If that is not the case, corrective actions need to be taken.


Define the Scope of Work and align the estimated workload, in terms of Man Working Days, with the available financial budget. A basic, standard Scope of Work for "Cluster Acceptance" may be limited to the analysis of:

  • Radio coverage
  • Handover success rate
  • Throughput

The introduction or testing of additional software features is not in the scope of basic cluster acceptance and requires the allocation of additional resources.

2.2.2 Acceptance Setup Phase

This phase basically refers to an assessment of the network to ensure that it is ready for acceptance. The assessment consists of diverse checks at site, cluster and network level.

2.3 Measurement Phase

  • Initial or iterative cluster/network drive test measurement campaigns using a Field Measurement Tool (FMT).
  • Drive testing will be performed randomly within the LTE network coverage. Measurements will be collected in good and poor radio conditions, with vehicle speeds reasonable for the type of road.
  • Collection of Performance Management (PM) counters is another measurement method. However, since the scope of this document is the acceptance of a network before the commercial launch, it is assumed that the volume of PM data is not enough to be statistically relevant.

2.4 Optimization Phase

Analysis of measurements: if the analysis of the measurements shows that the KPI requirements are not fulfilled, it is necessary to investigate the reasons and correct them by defining and implementing the necessary changes to the network (e.g. tilts, azimuths and parameters) based on the results of the measurement analysis.

Before repeating the drive test measurements it is recommended to check again that the considered sites are on air and running with no faults.

2.5 Reporting Phase

Once the drive test performance KPIs are passed, the final cluster acceptance report is created and presented to the customer for acceptance. Cluster sizes, cluster definitions (i.e. which particular sites) and drive routes need to be agreed beforehand with the customer. The same applies to the KPIs and the KPI targets against which the cluster/network is validated.


The number of iterations (drive test – analysis – implementation of changes) depends on different factors such as the initial status of the network, time constraints and costs. As a general rule, it is recommended to limit the number of drive tests in the pre-optimization phase to no more than two, with a third drive test performed only as an exception. However, this rule is project dependent.

Normally, cluster and network acceptances are performed with Field Measurement Tools (FMT) under unloaded conditions. The only load during the field test measurements should be the one induced by the test UEs used in the measurement. In specific cases the customer may request testing under loaded conditions. DL inter-cell interference can be generated with feature LTE819. Uplink load generation can be done by restricting the maximum number of PRBs per real user through an O&M parameter in the frequency domain scheduler for the victim cell, such that the desired throughput for the victim is still achievable.

The acceptance test procedures are the same for cluster and network acceptance, although there are some differences on the practical side in order to adapt them to the bigger geographical areas in the case of network acceptance (i.e. less detailed drive test routes that cross several clusters).

2.6 Handling of deviations

In a rollout network deployment there will be situations that affect the acceptance of the network. Sites that are not ready at the time of the acceptance, or sites that are not available in the nominal location, are examples of deviations. How to deal with those cases (responsibility, penalties, way forward…) needs to be agreed with the customer and documented in the contract. In some cases the acceptance requirements can be reviewed and changed during the acceptance if mutually agreed between NOKIA and the customer. The final duration of the acceptance period will be defined in the agreed time plan.

2.7 Exit Criteria

The measurement data shall be analyzed with pre-defined post-processing tools and compared against the defined targets. If any problem is found, corrective action, either through parameter changes or physical changes (e.g. antenna tilting, azimuth change, etc.), will be performed to improve the network performance. Measurements can be conducted again after the corrective action, only around the problem area. After the problem is rectified and the performance target is met, a cluster acceptance certificate/report shall be issued and signed by both parties.

Depending on the agreement with the customer, it may be possible to discard samples due to e.g. traffic jams, traffic lights and other factors that can affect the KPIs. The drive test data can be averaged over the entire drive test if more than one route is driven in the network. Should the averaged results meet the specified performance level, it will not be required to re-test either a single drive route that does not fulfill a certain KPI or the entire drive test route.

The customer may issue the Final Acceptance for the network after the area has passed the criteria for the area acceptance. However, there will be cases where the Final Acceptance of a network is granted only after a so-called stability period that shall elapse after the network is on air and carrying commercial traffic. In such cases, the network acceptance is also based on counters and not only on drive test KPIs.
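The averaging rule above can be sketched in code. This is a minimal illustration, not part of the guideline; the function name, the per-route KPI values and the 20 Mbps target are hypothetical:

```python
def acceptance_passed(route_kpi_values, target, higher_is_better=True):
    """Exit-criteria sketch: average one KPI over all driven routes and
    compare the average against the agreed target. If the average passes,
    individual failing routes do not trigger a re-test."""
    avg = sum(route_kpi_values) / len(route_kpi_values)
    return avg >= target if higher_is_better else avg <= target

# Hypothetical example: per-route DL throughput (Mbps) vs. a 20 Mbps target.
# One route is below target, but the average (~21.5 Mbps) passes.
print(acceptance_passed([18.5, 22.0, 24.1], 20.0))  # True
```

For KPIs where lower is better (e.g. RTT), `higher_is_better=False` inverts the comparison.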


3 Preparation Phase

3.1 Pre-requisites

This section presents some pre-requisites to ensure a smooth acceptance process:

  • The set of KPIs on which the acceptance will be based, their definitions and the target/minimum values for those KPIs must be clearly documented and agreed with the customer before the acceptance process starts (during the design phase).
  • If sites have not been acquired due to difficulties in site acquisition, and there is no alternative for them, the KPI calculation shall exclude the measurement results in the affected area.
  • Site and equipment installation acceptance have been completed for the areas to be tested.
  • Availability and capability of measurement tools and terminals is a pre-requisite to carry out the acceptance test. NOKIA will normally provide and use a 3rd party measurement tool. In case there are KPIs that cannot be measured by the existing measurement tool or terminal, these KPIs will be excluded from the acceptance test, or they will be reviewed or replaced by other KPIs.
  • The customer shall be responsible for the provision of test servers (e.g. FTP server) and applications for each service for the purpose of the acceptance tests to be carried out.
  • Measurement data that might have been affected by events and/or interference not attributable to NOKIA (e.g. poor performance of the test servers) shall be excluded from the processing of data of the respective network planning service acceptance test and shall have no impact on the final test result.
  • Minimum RF conditions shall be defined and met during the drive test. Data that does not meet those conditions should be excluded.
  • In order to have a radio test environment that can be considered controlled, it is assumed that no terminals other than those used for the performance evaluation are active in the test cluster.
  • Logbooks for network elements are useful tools for reproducing strange effects recorded during measurements.

3.2 Network Assessment

Even though the main targets of pre-launch optimization are coverage, quality and dominance areas, it is important to start with a network assessment covering the HW configuration and the parameter configuration, to exclude factors that can affect the network performance. It is recommended to compare the parameter baseline in the network against the NPO parameter recommendations (plan verification), and to compare the operator's 'actual' network configuration data against its baseline or, alternatively, against the NPO recommended configuration (parameter consistency). Additional assessment-related information can be found in the NPO Global Assessment Wiki in SharePoint.
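The parameter consistency check described above reduces to comparing two parameter sets and reporting mismatches. A minimal sketch, assuming parameters can be exported as simple name/value pairs (the function name and the example parameter names are hypothetical):

```python
def parameter_deviations(actual, baseline):
    """Compare the 'actual' network parameter set against a baseline or
    recommended set. Both arguments are dicts of parameter name -> value.
    Returns parameter -> (actual_value, baseline_value) for every mismatch,
    including parameters present on only one side (shown as None)."""
    deviations = {}
    for name in sorted(set(actual) | set(baseline)):
        a, b = actual.get(name), baseline.get(name)
        if a != b:
            deviations[name] = (a, b)
    return deviations

# Hypothetical example: one deviated power parameter is flagged.
print(parameter_deviations({"dlCellPwr": 43, "qRxLevMin": -120},
                           {"dlCellPwr": 46, "qRxLevMin": -120}))
```

In practice the exports come from the OSS and the planning tool; the point is simply that every deviation is listed for review before drive testing starts.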


3.2.1 Basic Checks

It is highly recommended to check the following, independently of the available tools:

  • PCI Planning Check: the Physical Layer Cell Identity (PCI) plan should be verified as part of the initial health check. In particular, it should be avoided that two neighbour cells have the same PCI and that cells belonging to the same eNodeB are allocated identities from within the same group. The latter can be avoided by ensuring that the cells of an eNodeB have different PCI modulo 3 values. If PCI planning was handled correctly by NetAct Optimizer, there should be no need for additional checks. It is worth mentioning that the implementation of features that increase the coverage after the initial PCI planning has been done (e.g. DL RS power boosting, RL30) may require a 'PCI re-tuning' in the network. More information about PCI planning can be found in [2].
  • Missing neighbours: one of the differences of LTE with respect to previous technologies (i.e. 2G, 3G) is that LTE may not require neighbour planning, as the UE is responsible for identifying adjacencies prior to completing a handover. In this sense, it may not be necessary in all cases to check for LTE–LTE neighbour list issues such as missing, undefined or one-way defined neighbours. This automatic neighbour planning is covered by the SON ANR features. However, the ANR features (up to RL60/RL45TD) do not include a mechanism for the automatic deletion of neighbour relations, so the optimization of neighbour relations is necessary. This has been observed in live networks. In case neighbour relations are defined by the planner, it is recommended to check for potential neighbour relation issues.
  • MHA/feeder configuration: although NOKIA's preferred solution for eNodeB deployment is the feederless solution (no need for feeders and MHA), an existing operator could prefer to re-use the existing configuration (e.g. feeders and MHAs). In this case, it is necessary to check that the feeder and MHA data are correct. Cases of X-feeders are still common in LTE networks.
  • Wrong parameter planning: certain areas like PRACH and UL Demodulation RS need to be planned following the latest recommendations to avoid an unnecessary impact on performance (i.e. low throughputs). It is advisable to check the planning recommendations in [2] and [8].
  • Parameter plan check: check that the parameters in the eNodeB are according to the planned parameters. It has been noted that in some cases changes done in the commissioning file are lost or not reflected in the network (actual file), e.g. remote electrical tilts.
  • Handover parameter check: as part of the assessment, it is recommended to check handover thresholds and offsets to ensure smooth mobility performance before drive testing.
  • Features check: be aware of the features that are activated, whether a license is available and whether they are correctly configured.
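The two PCI rules above (no neighbour cells sharing a PCI; no cells of one eNodeB sharing the same PCI modulo 3) lend themselves to a simple automated check. A minimal sketch, with a hypothetical data model (cell id mapped to eNodeB id and PCI):

```python
def pci_conflicts(cells, neighbours):
    """Flag the two PCI planning issues described above.

    cells: dict cell_id -> (enodeb_id, pci)          (hypothetical model)
    neighbours: iterable of (cell_a, cell_b) neighbour relations
    Returns (collisions, mod3_clashes) as lists of cell-id pairs.
    """
    # 1) Neighbour cells sharing the same PCI (PCI collision)
    collisions = [(a, b) for a, b in neighbours
                  if cells[a][1] == cells[b][1]]
    # 2) Cells of one eNodeB sharing the same PCI modulo 3
    mod3_clashes = []
    by_enb = {}
    for cid, (enb, pci) in cells.items():
        by_enb.setdefault(enb, []).append((cid, pci % 3))
    for enb, cell_list in by_enb.items():
        seen = {}
        for cid, m in cell_list:
            if m in seen:
                mod3_clashes.append((seen[m], cid))
            else:
                seen[m] = cid
    return collisions, mod3_clashes
```

Running this against a cell/neighbour export before the drive test would surface both collision types in one pass.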


3.3 Cluster Definition

There are different opinions about the ideal cluster size. In any case, clusters shall be defined by NOKIA and agreed with the customer. As a guide, the following criteria can be considered when defining clusters for acceptance:

  • The larger the cluster, the more stable the results, the more cost-efficient the acceptance/optimization process, and the better the impact of RF changes can be assessed. However, the cluster should stay at a manageable and processable size.
  • Cluster sizes between 10 and 20 geographically adjacent sites are appropriate in terms of allowing the definition of realistic drive routes which cover the area in detail within a reasonable time. This also represents a good compromise between cost and efficiency.
  • All sites in the cluster must have the site acceptance completed before the measurements.
  • Neighbouring clusters should be independent from an interference perspective. That is, sites belonging to a cluster should interfere as little as possible with the sites in the neighbouring cluster. In this line, cluster buffer zones are defined. A buffer zone consists of the cells that contribute measurable interference to the cluster. Typically, this will correspond to at least two tiers of sites from adjacent clusters. All on-air cells in the buffer zone pointing towards the cluster shall be optimized at the same time as the cluster.
  • For efficient optimization, at least 90% of the sites in a cluster should be available at the start of optimization. Therefore, clusters should be defined, and sites within the cluster prioritized, as early as possible within the rollout process.
  • When defining a cluster for acceptance, the cluster should exclude missing sites (e.g. late integration) and their immediate neighbouring sites. When a delayed site is completed, acceptance of this site and its neighbouring sites should be conducted by defining a cluster covering the site and its adjacencies.
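The 90% availability rule above is a simple gate that can be checked automatically from the site rollout status. A minimal sketch; the function name and the site-status representation are hypothetical:

```python
def cluster_ready(site_status, threshold=0.90):
    """Check whether a cluster meets the minimum site-availability rule
    before optimization starts.

    site_status: dict site_id -> bool (True = on air and accepted)
    threshold:   minimum fraction of sites that must be available
    """
    available = sum(1 for on_air in site_status.values() if on_air)
    return available / len(site_status) >= threshold

# Hypothetical 10-site cluster with one late site: 9/10 = 90%, so it passes.
status = {f"SITE{i:02d}": True for i in range(1, 10)}
status["SITE10"] = False
print(cluster_ready(status))  # True
```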

3.3.1 Cluster Border and Measurement Validity

Drive routes shall be defined to cross the edges of the clusters, to ensure that the edge of LTE coverage is crossed whilst drive testing. This will provide a clear coverage footprint of the border sites and show the extent of the service provision, e.g. towards 2G/3G networks where applicable. Coverage itself is statistical in nature, and the coverage measured by drive tests relates to the outdoor coverage probability.

3.4 Drive Route Definition The drive route definition for the acceptance tests shall follow the same rules as for other wireless access technologies (e.g. GSM, 3G). Drive routes are normally defined by NOKIA or third party engineers and agreed by the customer.


The main general rules for drive route definition are:

  • All sectors of every site in the cluster should be covered by the drive route.
  • Drive routes shall be defined to cross the edges of LTE coverage, to ensure that the edge of coverage is crossed whilst drive testing and to allow testing of inter-system HOs or RRC Connection Release with redirection, if applicable.
  • Routes should cover:
    o Key business centers, shopping centers, high school/office areas, tourist attractions and railway stations.
    o Major roads: motorways and A roads.
    o Other specific roads can be added to the drive route where there are unique terrain issues that need to be tested or they are simply considered important.
    o Tracking Area borders should also be covered as part of the drive route.
  • It is important in city areas that a fair assessment of coverage is provided by including sample side streets where coverage may be much lower than on the main thoroughfares.
  • The routes should be laid out to gain a clear footprint of each cell, in such a way that close-in problems such as low transmit power as well as far-off problems such as spill-over can be observed.
  • The ratio between the length of the route and the cluster area shall be approximately constant among clusters with the same clutter characteristics.
  • Routes should avoid roads with restricted access, pedestrian areas, secure areas or those requiring special access conditions.
  • Routes should avoid exclusion areas if this information is provided by the operator (i.e. already known areas of no coverage). Otherwise, after the initial drive test these areas must be agreed with the customer and documented.
  • The route must be feasible within the allocated timescales. Generally this is within a working day, taking into consideration the preparation time and the data consolidation at the end of each day. Therefore the drive time is unlikely to exceed 7 hours.
  • Busy roads should be avoided during peak traffic times.
  • In case it is necessary to re-drive the routes (i.e. KPIs are not passed after the initial drive), the same route must be driven following the same directions to reproduce the original conditions.
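The rule on keeping the route-length-to-area ratio constant can be made concrete as a route density figure. A minimal sketch; the function name and the example values are hypothetical:

```python
def route_density(route_length_km, cluster_area_km2):
    """Drive-route density: km of route per square km of cluster area.
    Per the rule above, this ratio should be roughly constant across
    clusters that share the same clutter characteristics."""
    return route_length_km / cluster_area_km2

# Hypothetical urban clusters: a 120 km route over a 40 km^2 cluster
# yields 3.0 km/km^2, the figure to match in comparable clusters.
print(route_density(120.0, 40.0))  # 3.0
```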

Network acceptance drive routes will follow the same principles, although there can be variations due to the wider geographical area (e.g. less dense routes) and due to the fact that the cluster acceptance tests have been passed previously, so they do not need to be as exhaustive. The drive route for the network acceptance shall cover several clusters.


Figure 2 shows an example of how to define the cluster measurement route and network measurement route:

Figure 2: Example of Cluster and Network acceptance routes concept

As previously mentioned, drive routes are defined by NOKIA or third-party engineers and agreed with the customer. To facilitate the handling of these routes they shall be created in electronic format, for which it is necessary to have accurate, up-to-date digital maps of the area including the different types of roads.

3.5 Coverage Thresholds

The coverage criteria or coverage KPIs for a network in terms of RSRP are based on the design criteria, i.e. on the link budget calculations for the selected reference service, and on the agreement with the customer. That is why the values to be used are case specific and may vary considerably.

LTE terminals measure RSRP and RSRQ from the serving cell. However, it is SINR that is used in the system simulations for dimensioning, so SINR values need to be mapped to RSRP and RSRQ values. A presentation covering the mapping of these values can be found in the RF Measurement Quantities section in [7]. The full study was conducted and verified in the lab, where it is possible to set specific load values; that is not the case in field measurements, as the load cannot be fixed. Therefore, although RSRP results are reliable, RSRQ results are not.

Another way of calculating the RSRP thresholds for a certain service, based on the dimensioning tool (link budget for a certain service), is presented in the example below. First it is necessary to establish the relationship between RSSI and RSRP: RSRP is defined (3GPP definition) as the average of the power levels received across all Reference Signal symbols within the considered measurement frequency bandwidth. Since it is an average, it corresponds to the received power of one RE.


RSSI is the received wideband power: the received serving cell power plus noise plus interference power across the whole considered bandwidth. The bandwidth is defined by N resource blocks (RBs), and each RB contains 12 REs. Under full load and high SNR it can be approximated that:

RSSI = 12 * N * RSRP, where N is the number of RBs considered

Expressing the above equation in logarithmic terms:

RSRP (dBm) = RSSI (dBm) - 10*log10(12*N)
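The scaling factors in Table 1 follow directly from this relation. As an illustration (not part of the original guideline), a short Python sketch computing 10*log10(12*N) for the standard LTE channel bandwidths:

```python
import math

# Number of resource blocks (N) for the standard LTE channel bandwidths
RB_PER_BANDWIDTH_MHZ = {1.4: 6, 3: 15, 5: 25, 10: 50, 15: 75, 20: 100}

def rsrp_from_rssi(rssi_dbm: float, n_rb: int) -> float:
    """RSRP (dBm) from full-load RSSI (dBm): RSRP = RSSI - 10*log10(12*N)."""
    return rssi_dbm - 10 * math.log10(12 * n_rb)

for bw, n in RB_PER_BANDWIDTH_MHZ.items():
    factor_db = 10 * math.log10(12 * n)
    print(f"{bw:>4} MHz (N={n:>3}): scaling factor = {factor_db:.2f} dB")
```

For the 10 MHz case (N=50) this gives 10*log10(600), i.e. approximately 27.78 dB, which is the figure used in the link budget example below.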

Table 1: Scaling factor between RSSI and RSRP for different bandwidths

Additionally, the received signal (RSSI) at the cell edge under maximum TX power (the link budget case) can be taken as the receiver sensitivity.

Example of threshold calculation based on a certain cell edge service. Parameters used in the link budget:
- Required cell edge throughput: 1 Mbps in DL and 384 kbps in UL
- Cell maximum TX power per antenna: 44.8 dBm
- 4Tx adaptive closed-loop MIMO used
- 10 MHz carrier bandwidth
- 22 dBi BTS antenna gain
- 0.4 dB jumper cable loss

With those inputs, using the dimensioning tool (link budget), the obtained receiver sensitivity is:

DL Rx sensitivity: -106 dBm

Based on all the above:

RSRP (dBm) = Rx sensitivity - 10*log10(12*N) + DeltaMAPL + other margins

DeltaMAPL accounts for the UL-DL unbalance: MAPL0 DL - MAPL0 UL. Other margins to be considered depending on the case are log-normal fading, gain against shadowing and building penetration loss:
- Log-normal fading (LNF): 8.4 dB
- Gain against shadowing: 2.7 dB
- Building penetration loss (BPL): 22 dB

Following the example:
- Coverage threshold without LNF margin, gain against shadowing and BPL:
  RSRP (dBm) = -106 dBm - 27.78 dB + (151.65 - 135.42) = -117.55 dBm


- Coverage threshold with LNF margin and gain against shadowing:
  RSRP (dBm) = -106 dBm - 27.78 dB + (151.65 - 135.42) + (8.4 - 2.7) = -111.85 dBm
- Coverage threshold with LNF margin, gain against shadowing and BPL:
  RSRP (dBm) = -106 dBm - 27.78 dB + (151.65 - 135.42) + (8.4 - 2.7) + 22 = -89.85 dBm

Note: these calculations are estimations based on the dimensioning tool.
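The worked example above can be checked numerically. A minimal Python sketch reproducing the three thresholds; all input values are taken from the document's own link budget example:

```python
import math

RX_SENS_DL_DBM = -106.0                    # DL receiver sensitivity from the tool
N_RB = 50                                  # 10 MHz carrier -> 50 resource blocks
SCALING_DB = 10 * math.log10(12 * N_RB)    # ~27.78 dB (RSSI-to-RSRP scaling)
DELTA_MAPL = 151.65 - 135.42               # MAPL0 DL - MAPL0 UL (UL-DL unbalance)
LNF, GAS, BPL = 8.4, 2.7, 22.0             # fading, shadowing gain, penetration loss

base = RX_SENS_DL_DBM - SCALING_DB + DELTA_MAPL   # no margins
with_lnf = base + (LNF - GAS)                     # LNF net of gain against shadowing
with_bpl = with_lnf + BPL                         # indoor (BPL) threshold

print(f"Without margins: {base:.2f} dBm")         # -117.55 dBm
print(f"With LNF/GAS:    {with_lnf:.2f} dBm")     # -111.85 dBm
print(f"With BPL:        {with_bpl:.2f} dBm")     # -89.85 dBm
```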


4 Measurement Phase

Once the cluster, drive routes and coverage thresholds are defined and a health check of the driven eNodeBs has been done (alarms, configuration, data-build), the drive test campaigns can start. Depending on the design criteria and the agreed KPIs stated in the contract, the relevant bearer service(s) will be measured. If the initial set of measurements is satisfactory and already passes the KPIs, no further drives are necessary and the cluster acceptance report can be prepared for customer approval. If the KPIs are not achieved after the initial drive test (the usual case), it is necessary to optimize the cluster and perform more drive tests before generating the cluster acceptance report.

The acceptance tests shall be conducted in a 'frozen' area of the network. A frozen area is an area ready and prepared for acceptance or undergoing acceptance tests. No changes affecting the design or performance characteristics can be made to a frozen area unless mutually agreed between customer and NOKIA.

Third-party companies are normally in charge of the drive test campaigns. They provide the measurement files to NOKIA for further analysis. Time for the selection of the third-party company and for the organization of the task must be accounted for within the project. Examples of field test cases for acceptance are presented in Section 4.4.

4.1 FMT: Test Equipment

The acceptance tests are carried out by means of Field Measurement Tools (FMT). These tools monitor and measure the performance of the air interface of the LTE network, allowing problem areas (e.g. poor quality) to be detected and the different services to be tested. The result of the analysis of the collected measurements will be a set of KPIs showing, amongst others, the signal strength, the signal quality and the performance associated with the different radio bearers used to support the different services.

FMT systems support different terminals (e.g. test mobiles, dongles and/or scanning receivers). As in other wireless technologies, the system also consists of a GPS receiver, which allows the graphical display of the measurements, and a PC with the measurement software. The reliability of the FMT is strongly dependent on the laptop specifications: low-end laptops are likely to experience crashes and are therefore not recommended. The FMT application vendor's recommendations on minimum HW requirements should be followed.

Whether or not external antennas are used will be agreed with the customer beforehand, although the common preference is not to use external antennas. NOKIA has a list of 'preferred tools' and, in some cases, the customer may indicate a preference for certain tools/terminals.


Figure 3: Sample configuration of FMT

The market for field measurement equipment has developed rapidly. Detailed information on NOKIA-recommended drive test tools, terminals, scanners, NPO contacts, supplier contacts, frame agreements and pricing conditions, as well as a project survey showing the DT tools and scanners most commonly used in NOKIA projects, can be found in the LTE Drive Test Tools Overview and in the Tool Section in SharePoint. The sections below provide a brief introduction to the tools that have been most popular among network operators and NOKIA engineers.

4.1.1 Scanners

Popular scanner solutions within NOKIA are the JDSU, Rohde & Schwarz and PCTel scanners. JDSU and PCTel are widely used, with positive feedback. Anite also offers its own scanner, the Nemo FSR1, for Nemo Outdoor kits. A study comparing different scanners can be found in [9]. Additional scanner information is available in the following links:
- Comparison between different PCTel scanners
- Summary of improvements in the W1314B vs. the W1314A (JDSU scanner); the W1314B is the updated version.

In general, scanners support simultaneous multi-technology/multi-band measurements. The expected scanner measurements are:
- P-SCH RSSI
- S-SCH RSSI
- LTE physical layer Cell-ID
- Reference Signal RSRP
- Reference Signal RSRQ
- Reference Signal SINR and/or S-SCH SINR

Dominance plots can be based on the LTE physical layer Cell-ID, since the allocation of physical layer cell identities is analogous to scrambling code planning in UMTS. It should be possible to perform an early tuning of the network (e.g. during pre-launch optimization) based only on signal strength, by detecting areas of poor coverage or dominance (interference). In any case, scanner measurements are always complementary to the field measurement tool (FMT) measurements when doing the acceptance of a network. Figure 4 shows the measurement visualization of the JDSU scanner (W1314A RF, ex-Agilent).


Figure 4: Example of JDSU LTE Scanner measurements (W1314A RF)

4.1.2 Drive Test Tools and Terminals

4.1.2.1 Drive Test Tools

The most popular drive test tools used in the field are XCAL (Accuver), NiXt (JDSU), Nemo Outdoor (Anite) and TEMS:
- Nemo Outdoor (Anite): supports LG, Altair and Qualcomm chipsets for FDD and TDD. It is compatible with PCTEL and Rohde & Schwarz (R&S) scanners.
- XCAL (Accuver): probably the most popular at the moment. It supports Samsung, LGE and Qualcomm chipsets, amongst others, and is also compatible with PCTEL and R&S scanners. It logs, decodes and filters L1, L2 and L3 information and packet data. Parameters are reported with one-second granularity.
- NiXt E6474A (JDSU): for FDD and TDD. It also supports LG, Samsung and Qualcomm.
- TEMS Investigation (Ascom): popular amongst operators.

The following list shows (not exhaustively) the LTE parameters/information supported:
- Layer 1: cell info, channel info, CQI, serving cell RSRP, RSRQ, UE Tx power, RACH info, PDSCH throughput, PDSCH BLER, PBCH BLER, DL grant, UL ACK/NACK status, PUSCH PHY throughput, UL scheduling status, UL grant, DL ACK/NACK status.
- Layers 2 and 3: PDCP info (PDU, throughput, configuration, security), RLC info (status, throughput), MAC info (PDU, throughput, HARQ ACK/NACK info, BLER, TA), RRC messages, NAS messages.


4.1.2.2 Terminals

There are currently many commercially certified LTE devices. Factors that influence the terminal selection are: the customer preference, the availability for a certain bandwidth, and whether the terminal is supported by the drive test tool. Detailed information on which terminals (FDD and TDD) are supported by the most popular drive test tools within NOKIA is available in the 'LTE Devices xx' workbooks of the LTE Drive Test Tools Overview. Additional information on LTE terminals for testing purposes, pricing and ordering can be found in the LTE TIP, the Terminal Information Portal for LTE.

4.2 Service Testing and FMT

Cluster acceptance guidelines focus on the pre-launch optimization of a commercial network. Acceptance testing is based on field testing (mainly drive tests) using different applications or services. The current guidelines focus on FTP and ping, FTP being the most popular. Other applications such as HTTP and video streaming may be considered in future updates if there is demand for them during cluster acceptance.

To test data calls, the FMT needs to be equipped with a data testing system. The data testing system comprises a data server that automatically answers calls made by an FMT, and the respective client software in the FMT. Depending on the configuration of the FMT, the data server either sends or receives measurement data (packets) to or from the client software in the FMT. The data server is loaded with the appropriate test applications (e.g. FTP, streaming, VoIP). To be able to make automatic calls, the FMT records the actions in a script file and acts in accordance with it.

In order to eliminate the effect of internet-induced delay, the application servers shall be located within the LTE core network under test, close to the Serving/PDN (SAE) Gateway. A dedicated FTP server is required for data testing. The laptop should have optimal settings to be able to achieve maximum throughput; e.g. it is important to ensure that the TCP window size in the PC is large enough not to limit test performance. If the Peak User Rate is required, it should be measured in stationary conditions.

Although UDP streaming testing can provide a more accurate view of the end-to-end performance of the different network elements, performance testing is typically done with FTP download and upload. For FTP application testing a compressed file is used for upload as well as download. A file transfer duration of one minute or more is recommended, with the transfer being executed in binary mode.
Example of logs that should be recorded during the tests:
- FTP logs with average user throughput
- UE logs with:
  - L1 throughput
  - SINR
  - CQI
  - RSRP
  - RSSI
  - MCS
- GPS log (coordinates)
- Scanner logs:
  - LTE physical layer Cell-ID
  - Reference Signal RSRP
  - Reference Signal RSRQ
  - SINR (Reference Signal or S-CH, depending on the scanner)
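The FTP server itself must be verified in good RF conditions before the acceptance drives start (see Section 4.4.1). A hedged Python sketch of such a sanity check using the standard ftplib module; the host, credentials and file name are placeholders, and the helper functions are illustrative rather than part of any FMT:

```python
import time
from ftplib import FTP

def ftp_download_throughput_mbps(host: str, user: str, password: str,
                                 remote_file: str) -> float:
    """Download remote_file and return the average application-layer
    throughput in Mbit/s. Host/credentials are placeholders."""
    received = 0

    def on_chunk(chunk: bytes) -> None:
        nonlocal received
        received += len(chunk)

    ftp = FTP(host)
    ftp.login(user, password)
    start = time.monotonic()
    ftp.retrbinary(f"RETR {remote_file}", on_chunk, blocksize=64 * 1024)
    elapsed = time.monotonic() - start
    ftp.quit()
    return throughput_mbps(received, elapsed)

def throughput_mbps(num_bytes: int, seconds: float) -> float:
    """Average throughput in Mbit/s for num_bytes transferred in `seconds`."""
    return (num_bytes * 8) / (seconds * 1e6)

# Example: a 400 MB file downloaded in 60 s averages ~56 Mbit/s
print(throughput_mbps(400 * 1024 * 1024, 60.0))
```

The measured figure can then be compared against the expected peak for the configured bandwidth before any drive data is collected.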

Figure 5: Example of test configuration

4.3 Testing Methods

Three testing methods can be defined based on the EMM/ECM states. The choice of testing method has an impact on the measurable KPIs.

Method 1: Sessions start from EMM-DEREGISTERED state

Each session within the drive test script starts from EMM-DEREGISTERED state. Testing includes: Attach – Application Data – Detach. It is possible to include several data applications in the same session (FTP DL/UL, ping). This method of testing allows measuring success rates/times for: Attach, E-RAB setup, RRC setup, application setup and HO. As a drawback, it does not reflect end user behavior, as users typically start sessions from EMM-REGISTERED state.

Example script (to be repeated for the duration of the drive test route):
- Attach


- FTP DL (5MB)
- FTP UL (3MB)
- Ping 32-1024B (x10)
- Detach

The example below shows the L3 message sequence observed when starting from EMM-DEREGISTERED state (the timestamp and subchannel columns of the original trace are omitted for readability):

ATTACH_REQUEST
PDN_CONNECTIVITY_REQUEST
RRCConnectionRequest
RRCConnectionSetup
RRCConnectionSetupComplete
SecurityModeCommand
SecurityModeComplete
UECapabilityEnquiry
UECapabilityInformation
RRCConnectionReconfiguration
RRCConnectionReconfigurationComplete
ATTACH_ACCEPT
ACTIVATE_DEFAULT_EPS_BEARER_CONTEXT_REQUEST
MeasurementReport
ATTACH_COMPLETE
ACTIVATE_DEFAULT_EPS_BEARER_CONTEXT_ACCEPT
ULInformationTransfer
MeasurementReport
DETACH_REQUEST
ULInformationTransfer
DLInformationTransfer
RRCConnectionRelease
DETACH_ACCEPT
SYSTEM_INFORMATION_BCH
MASTER_INFORMATION_BLOCK

Method 2: Sessions start from ECM-IDLE, EMM-REGISTERED state

Each session starts from ECM-IDLE, EMM-REGISTERED state. The sequence to be observed is: RRC setup – Application data – Wait – Release due to inactivity. As with the previous method, several data applications can be included in the same session (FTP DL/UL, ping). It allows measuring success rates/times for: Service Request, E-RAB setup, RRC setup, application setup and HO. This testing method poses challenges in the practical measurement setup, especially with an inactivity timer above 10 s: it requires specific firewall settings on the measurement laptop to block all outgoing packets during the wait period.

Example script (to be repeated for the duration of the drive test route):
- RRC setup/service request
- FTP DL (5MB)
- FTP UL (3MB)
- Ping 32-1024B (x10)
- Wait until released by eNodeB


The example below shows the L3 message sequence observed when starting from EMM-REGISTERED, ECM-IDLE state; the same setup/release sequence repeats for each session (the timestamp and subchannel columns of the original trace are omitted for readability):

RRCConnectionSetup
RRCConnectionSetupComplete
SecurityModeCommand
SecurityModeComplete
RRCConnectionReconfiguration
RRCConnectionReconfigurationComplete
MeasurementReport
MeasurementReport
RRCConnectionRelease
RRCConnectionRequest
RRCConnectionSetup
RRCConnectionSetupComplete
SecurityModeCommand
SecurityModeComplete
RRCConnectionReconfiguration
RRCConnectionReconfigurationComplete
MeasurementReport
MeasurementReport
RRCConnectionRelease
...

Method 3: Sessions start from ECM-CONNECTED state

Each session starts from ECM-CONNECTED, EMM-REGISTERED state. In this case, only application data is observed. As in the previous cases, several data applications can be included in the same session (FTP DL/UL, ping). It is possible to measure success rates/times for: application setup and HO. Only mobility- and reestablishment/drop-related signaling is seen.

Example script (to be repeated along the drive test route):
- FTP DL (5MB)
- FTP UL (3MB)
- HTTP page DL (x5)
- Ping 32-1024B (x10)

The table below shows a summary of the main KPIs that can be measured based on the testing methods presented above.


Table 2: KPIs that can be measured with the different methods

The recommended method for cluster acceptance is Method 2 (ECM-IDLE, EMM-REGISTERED), as it most closely matches the end user experience. Method 1 (EMM-DEREGISTERED) could be used if Attach success rates are mandatory as part of the acceptance KPIs.

4.4 Field Test Cases for Acceptance

This section presents generic field test cases for acceptance. Several terminals can be connected to the FMT, allowing different areas/applications to be tested simultaneously. Examples of test cases covering different aspects are included at the end of the section. The final acceptance test plan needs to be agreed with the customer, so there could also be additional tests if they are required.

All measurements, especially those measuring rare effects, need a sufficient number of samples. There are different approaches, but the minimum requirement for the drive tests is at least one hundred valid samples for any KPI measured in percent. The more samples, the more statistically reliable the KPIs will be.

There are two ways to create a data stream in UL/DL:
i. A large file can be uploaded/downloaded from a server.
ii. A continuous stream of dummy data can be created using the software JPerf/iPerf. This is used to measure latency, throughput and jitter.

The sections below refer to the standard option of FTP file download/upload.
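The hundred-sample minimum can be put in perspective with a simple confidence-interval estimate. An illustrative Python sketch using the normal approximation; the 95% z-value and the example success rate are assumptions for illustration, not contractual values:

```python
import math

def success_rate_margin(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the normal-approximation 95% confidence interval
    for a success-rate KPI measured as p over n samples."""
    return z * math.sqrt(p * (1 - p) / n)

# With 100 samples, a measured 95% success rate is only known
# to within roughly +/- 4.3 percentage points.
for n in (100, 400, 1000):
    margin = 100 * success_rate_margin(0.95, n)
    print(f"n={n:>4}: 95% success rate known to +/- {margin:.1f} %-points")
```

This is why repeating the route to gather more calls (as recommended in the FTP test case below) directly improves the reliability of percentage KPIs.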

4.4.1 FTP Upload/Download

For the test, the UE first attaches to an eNodeB and the EPC (state EMM-REGISTERED and ECM-CONNECTED) and the default bearer is set up. The UE has subscription information in the HSS with QCI=9 (non-GBR). After the network attach, the LTE terminal is ready to exchange data with the network and measurements can start.

Transfer durations of 1 minute or more for DL and UL transfers are recommended, so file sizes should be selected according to the bandwidth and the known average DL/UL throughputs. The example below considers an available bandwidth of 20 MHz. It is common in trials and small clusters to use smaller file


sizes (e.g. 100MB for downlink and 50MB for uplink in the 20 MHz bandwidth case), as the file size also depends on the nature of the test (e.g. FTP-only testing versus mixed application tests such as FTP and ping together). As mentioned above, the scope is to gather enough samples during the drive test. FMT tools can log several terminals simultaneously; this allows, for example, two terminals making data calls in parallel: one terminal can make short data calls to capture accessibility KPIs (attach SR/times) while another makes long data calls to better capture DL/UL throughputs, drop rates and handover KPIs.

IMPORTANT NOTE: The FTP server must be tested (e.g. in very good RF) prior to the start of the acceptance drives; otherwise, results may not be valid, affecting the acceptance. The application server should be located on the SGi interface, as its location directly impacts latency and application throughput. If the maximum FTP throughput cannot be reached in very good RF conditions, this issue must be resolved before actual drive testing. Testing with UDP download/upload in the same location can show whether the throughput problem is due to RF/eNodeB issues or to core network/FTP server issues. For more troubleshooting recommendations check [8].

The TCP window size may limit the maximum achievable throughput, as TCP transmits data up to the window size before waiting for acknowledgements, so the full bandwidth of the network may not always be used. In LTE, since the available bandwidth can be much higher than in WCDMA or GSM, the throughput impact of the common window sizes is larger.

Note: High downlink data throughput can be observed when the UE is in good radio conditions. In medium and poor radio conditions, more forward error correction (FEC) is needed and a lower modulation and coding scheme (MCS) must be assigned; as a result, the downlink data throughput is reduced.
Good radio conditions can be provided for a larger number of users if the eNodeB antenna systems are optimized in terms of antenna height, tilt and azimuth. Higher data rates in UL are possible since RL30/RL25TD with the Increased Uplink MCS Range feature (LTE829). Other planning parameters, for instance the usage of MIMO, also influence the achievable throughput.
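The TCP window effect mentioned above can be quantified with the bandwidth-delay product. A small Python sketch; the RTT and target rate values are illustrative, not measured:

```python
def max_tcp_throughput_mbps(window_bytes: int, rtt_s: float) -> float:
    """Upper bound on single-connection TCP throughput: window / RTT."""
    return (window_bytes * 8) / (rtt_s * 1e6)

def required_window_bytes(target_mbps: float, rtt_s: float) -> int:
    """Bandwidth-delay product: window needed to sustain target_mbps at rtt_s."""
    return int(target_mbps * 1e6 * rtt_s / 8)

# A legacy 64 KB window at an assumed 30 ms RTT caps a single TCP
# connection at ~17.5 Mbit/s, far below a 20 MHz LTE carrier;
# sustaining ~100 Mbit/s would need a window of ~375 KB.
print(max_tcp_throughput_mbps(64 * 1024, 0.030))   # ~17.5
print(required_window_bytes(100.0, 0.030))         # 375000
```

This is why the laptop's window scaling settings must be checked before blaming the radio for low FTP throughput.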


Test Equipment:
- FTP data server connected directly to the SGi interface
- 1 LTE-capable terminal and FMT

Method of testing:
1. Set the FMT to generate calls to the FTP server automatically with the following script/sequence:
   a. Connection attempt (network attach)
   b. Log into the FTP server
   c. FTP download of a 400MB file
   d. Wait 10 s after the session finishes
   e. FTP upload of a 200MB file
   f. Log out of the FTP server
   g. Connection release (network detach)
   h. Repeat all previous steps (from 'a')
2. Start drive testing until the drive route is completed.
3. Start the FMT with the script defined in step 1.
4. Stop the FMT at the end of the route (see Note 1).
5. If the route does not generate more than 100 calls, repeat the measurement on the same route (the more calls, the more reliable the statistics).
6. The FMT shall record the Reference Signal RSRP and SINR during the measurement.

Note 1: It is recommended to split the session into several log files (of about one hour duration each). This facilitates the handling of the files and prevents losing considerable amounts of data in case of corrupted files. The laptop Windows settings shall be optimized for high throughput.

Analysis: The following KPIs should be calculated:
- Average throughput for UL and DL [Mbps]
- Completed Session Ratio: upload/download session success rate [%]
- Accessibility: PS data call success rate [%]
- HO success rate [%], HO interruption time
- Accessibility: network attach success rate [%], attach time [ms]

4.4.2 User Plane Latency (RTT)

End-to-end user plane delay is based on the measurement of the round trip time (RTT) of a ping from the UE to the IP host that is one hop away from the SAE-GW. Figure 5 shows the test configuration including the different interfaces involved: Uu, S1-U and SGi. Adaptive MIMO, if considered by the customer, is implicitly involved. The one-way delay, which can evaluate the user plane performance in each direction more accurately, is not included in this test scope because it is difficult to achieve tight time synchronization between the UE (and the PC connected to it) and the IP host connected to the EPC. Note that the EPC involved in this test should be a NOKIA product; EPCs from other vendors are subject to IOT testing.


Ping tests are normally done using stationary terminals under good radio conditions (good SINR). Another possibility is to include the ping testing within a drive test sequence (i.e. attach, FTP DL, FTP UL, ping, detach), although in that case good SINR cannot be guaranteed. The common packet size for ping is 32 bytes, but the customer may require other packet sizes, e.g. 1400 bytes. This needs to be considered in the KPIs, as the bigger the packet size, the longer the RTT.

Example of 100 consecutive pings in the Windows command prompt with packet size = 1400 bytes:

C:\> ping xx.xx.xx.x -n 100 -l 1400
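RTT statistics can also be extracted from saved console output rather than read off the screen. A hedged Python sketch parsing Windows-style ping reply lines; the sample output text below is fabricated for illustration:

```python
import re

def parse_ping_rtts_ms(output: str) -> list[int]:
    """Extract per-reply RTT values (ms) from Windows-style ping output."""
    return [int(m) for m in re.findall(r"time[=<](\d+)ms", output)]

# Fabricated sample of Windows ping output (placeholder host 10.0.0.1)
sample = """Reply from 10.0.0.1: bytes=32 time=14ms TTL=62
Reply from 10.0.0.1: bytes=32 time=18ms TTL=62
Reply from 10.0.0.1: bytes=32 time=15ms TTL=62"""

rtts = parse_ping_rtts_ms(sample)
print(f"avg={sum(rtts) / len(rtts):.1f} ms, peak={max(rtts)} ms")  # avg=15.7 ms, peak=18 ms
```

Saving the raw ping output alongside the FMT logs makes the average and peak RTT KPIs reproducible afterwards.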

Test Equipment:
- Application server connected directly to the SGi interface
- 1 LTE-capable terminal and FMT (or application PC)

Method of testing:
1. Verify DL SINR: attach the UE to the cell under test (no load unless otherwise specified) to verify that the DL SINR of the selected location(s) is within the required range (see Table 6).
2. Attach the UE to the network so that a radio bearer is established correctly and an IP address is assigned to the UE.
3. Ping from the UE to the file server: execute 100 consecutive pings towards the file server or any other IP host one hop away from the SAE-GW, with packet size = 32 bytes. Example in the Windows command prompt: C:\> ping xx.xxx.xx.x -n 100
4. Save the average and peak RTT performance shown on the screen.
5. Steps 3 and 4 shall be executed multiple times to ensure accuracy and consistency of the results.

Analysis:

The following KPIs should be calculated:
- E2E IP latency: active-mode RTT [ms]

4.4.3 Mobility Test Cases

Handover testing implicitly happens while testing other applications along the drive route. The most common handovers are intra-LTE handovers. Depending on the characteristics of the operator network, IRAT handover performance might be included in the acceptance.

4.4.3.1 Intra-LTE Handovers (X2, S1)

Pre-conditions:
- UE status before measurement: EMM-REGISTERED and ECM-IDLE.


- UE is locked to the LTE network.

Test Equipment:
- 1 LTE-capable terminal and FMT

Method of testing:
1. Set the FMT to generate calls to the FTP server automatically with the following script/sequence:
   a) Connection attempt
   b) Log into the FTP server
   c) FTP download
   d) After the FTP download is completed, repeat the FTP download (from 'c')
2. Start the drive test and continue driving until the drive route is completed.

Analysis: The following KPIs should be checked:
- Intra-LTE handover success ratio [%]
- Intra-LTE handover interruption time [ms]

4.4.3.2 IRAT Handovers

Pre-conditions:
- UE status before measurement: EMM-REGISTERED and ECM-IDLE.
- UE is not locked to the LTE network.
- The test is performed on the edge of LTE coverage where WCDMA coverage is present.

Test Equipment:
- 1 LTE-capable terminal and FMT

Method of testing:
1. Set the FMT to generate calls to the FTP server automatically with the following script/sequence:
   a) Connection attempt
   b) Log into the FTP server
   c) FTP download
2. Start the drive test and continue driving until the UE is on the edge of LTE coverage and performs an inter-RAT handover to UTRAN.
3. Drive back again until the UE hands over back to E-UTRAN.

Analysis: The following KPIs should be checked:
- Inter-RAT handover success rate [%]
- Redirection time to E-UTRAN [ms]

4.4.4 CSFB (CS Fallback)

CSFB cases are also included in the acceptance, as LTE is usually deployed as a modernization of existing 3G and/or 2G networks.


The CS Fallback functionality provides seamless voice service and enables the reuse of the existing CS core networks for voice for LTE subscribers. When voice services are initiated or terminated to or from a UE attached to the EPS, the network forces the UE to fall back to the GERAN/UTRAN network, where the CS procedures are carried out. There are several CSFB variants (e.g. via redirect, PS HO based, redirect with SIB, redirect with deferred SIB), but the acceptance testing procedure is the same. The proposed test is for the MOC (Mobile Originated Call) case. If agreed with the customer, MTC (Mobile Terminated Call) tests can also be executed.

Preconditions:
- UE status before measurement: EMM-REGISTERED and ECM-IDLE.
- UE is not locked to the LTE network.

Test Equipment:
- 1 LTE-capable terminal and FMT

Method of testing:
1. Set the FMT to generate calls to the test call number automatically with the following script/sequence:
   a. CS call attempt to the test call number
   b. CS call for 25 s
   c. Terminate the CS call from the UE (connection release)
   d. Wait 30 s before starting the next session and ensure the UE has already reselected back to LTE
   e. Repeat the CS call attempt (from 'a')
2. Start the drive test and continue driving until the drive route is completed.

Analysis: The following KPIs should be checked:
- CSFB call setup success ratio [%]
- CSFB call setup time [s]

4.4.5 Link Budget Verification

The purpose of this test case is to verify that the predefined customer cell edge throughput is achieved at the required SINR values at the cell edge. It is not expected to be carried out in all cells of a network as part of the cluster acceptance, but it has been included for completeness. The calculation is based on propagation models such as Okumura-Hata or COST 231. The link budget is highly dependent on the UE performance. Suitable calculations/simulations of the link budget and cell edge performance based on concrete assumptions are to be offered separately and used as input for the test.

The test is performed with a single UE measuring sector coverage in DL/UL with service (DL FTP, UL FTP) and 0% interference, moving from RF conditions with good SINR towards the cell edge while measuring the SINR. Once the required SINR (calculated from simulation for the predefined customer throughput) is reached, a DL/UL FTP transfer is performed at that location and the DL/UL user throughput and SINR are measured. The expected result is throughput vs. SINR. Optional: during the test, the


call could be continued until the end of coverage is reached and the call drops. This allows verifying the proper release of the call and of its resources.

Preconditions:
- UE is attached to the LTE cell (state EMM-REGISTERED and ECM-CONNECTED).
- HO thresholds are configured so that no HO is performed at the cell edge.
- Predefined route in the tested cell from good SINR towards the cell edge.
- The required SINR for the predefined cell edge throughput is calculated with the dimensioning tool RAN_Dim.

Test Equipment:
- 1 LTE-capable terminal and FMT
- Iperf tool
- Dimensioning (link budget) tool

Method of testing:
1. Drive to the cell edge until the required SINR conditions are verified.
2. Start the Iperf server (receiver) on the application PC (UE side for DL traffic, core side for UL traffic), specifying the transport protocol (TCP, UDP) and an appropriate window size.
3. Start the Iperf client (sender) on the application PC (core side for DL traffic, UE side for UL traffic), specifying the IP address of the Iperf server, the transport protocol (TCP, UDP) and, for UDP, the appropriate bandwidth.
4. Verify the DL SINR, DL RSSI and cell edge DL user throughput using Iperf.

Analysis: The following measurements/KPIs should be checked, as the final scope is to verify throughput vs. SINR:

4.4.6

SINR (DL/UL) RSSI (DL/UL) User Throughput for DL/UL
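The Iperf invocations used in the method above can be sketched as a small helper that builds the server and client command lines. The IP address, window size, UDP bandwidth and test duration below are illustrative assumptions, not values from this guideline; only standard iperf flags (-s, -c, -u, -b, -w, -t) are used.

```python
def iperf_commands(server_ip, protocol="TCP", window="256K", udp_bw="50M", duration_s=60):
    """Build example iperf server/client command lines for a throughput test.

    server_ip, window, udp_bw and duration_s are placeholder assumptions;
    the actual values depend on the project setup agreed with the customer.
    """
    if protocol.upper() == "TCP":
        # TCP: throughput is governed by the window size on both ends
        server = f"iperf -s -w {window}"
        client = f"iperf -c {server_ip} -w {window} -t {duration_s}"
    else:
        # UDP: the offered bandwidth must be specified on the client side
        server = f"iperf -s -u -w {window}"
        client = f"iperf -c {server_ip} -u -b {udp_bw} -t {duration_s}"
    return server, client

# For DL traffic the server runs on the UE-side application PC and the
# client on the core-side application PC (reversed for UL).
server_cmd, client_cmd = iperf_commands("10.0.0.2", protocol="UDP")
```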

4.4.6 Drive Test configurations for Acceptance

Since FMT supports several terminals simultaneously, the common practice is to use a setup that allows measuring different KPIs simultaneously. The following tables present setup combinations of test cases that can be used for acceptance. The final combinations depend on the KPIs agreed with the customer, including whether the operator has a 2G/3G network. These tables should only be considered as a reference.


Table 3: Test case description valid for cluster acceptance and single site verification

Note: PING measurements in mobility tests can be a risk during HO situations, so they can be removed.

Each terminal is dedicated to measuring different KPIs. In the example above:
• MS3: eUTRAN RAB Accessibility
• MS4: eRAB Drop Rate
• MS4: LTE DL Throughput (Mobility)
• MS4: LTE UL Throughput (Mobility)
• MS4: Intra-LTE Handover Failure Rate
• MS4: Intra-LTE Handover Interruption Time
• MS5: CSFB Success Rate
• MS5: CSFB Call Redirection Time

Example where the acceptance includes stationary tests for site validation:

Table 4: Test case description for stationary case

Main KPIs:
• MS1: LTE Latency
• MS1: Peak LTE DL Throughput
• MS1: Peak LTE UL Throughput

Example configuration to test data session continuity (IRAT mobility to 3G/2G):


Table 5: Test case description for data session continuity (IRAT)

Main KPIs:
• MS3: eUTRAN to UTRAN Session Continuity Failure Rate [%]
• MS3: Redirection Time to eUTRAN
• MS4: % of time in LTE

In this case, the drive route needs to be planned around the edge of LTE Coverage.

4.5 Single Site Acceptance

If agreed with the operator in the preparation phase, the acceptance can include single site performance verification. This task is a ‘sub-project’ in itself, and the increase in time, resources and cost of the acceptance needs to be considered and agreed with the customer beforehand. Single site acceptance can be a mix of stationary and mobility tests. The KPIs to be verified and their target values need to be agreed with the customer.

Stationary tests: to be done on each sector in a location with excellent RF conditions. Some of the measurements/functionalities to verify the correct performance of the site:
o CINR
o RSRP
o DL/UL throughput
o CSFB functionality

Mobility tests around the site: the scope of the mobility test around the site is to verify:
o Outgoing and incoming handovers for each sector
o Whether there are X-feeder problems
o Whether the antenna azimuth and beamwidth are as expected
o RF coverage and quality
o Throughput and latencies

An example test case description for single site verification can be found in Table 3. The same testing/verification procedure can be used for late sites. In that case, the drive route should be planned around the late site and the 1st tier of neighbours.


5 Optimization Phase

In this section of the pre-launch acceptance process, the measurements from the drive test are analyzed to obtain the cluster/network KPIs, which are then compared against the agreed KPIs. If the cluster already passes the KPIs for acceptance after the initial drive, it will not be necessary to take any corrective action and the report for that cluster can be generated. However, it is common that some optimization (corrective action) is necessary after the analysis of the drive test measurements.

5.1 Post-processing tools

There are several commercial post-processing tools available, most of them aligned with the drive test solutions. It is recommended to check [10] for updates. Detailed information on NOKIA-recommended drive test and post-processing tools, terminals, NPO contacts, supplier contacts, frame agreements and pricing conditions can be found in Drive testing and post-processing tools information and in the Tool Section within SharePoint.

• Nemo Analyze: Part of the NOKIA preferred tools; it should be considered the de facto tool for Nemo drive test analysis. It has the best compatibility with the new Nemo Outdoor versions. It also supports (at additional cost) R&S and TEMS file formats. Suitable for large data volumes; customized analysis and KPIs (e.g. for acceptance reports) are possible. NOKIA custom Excel-based reports, including integrated map plots (graphs, charts), can be used.
• Actix Analyzer/Spotlight: Supports LTE file formats from many LTE drive test tools (Accuver XCAL, Nemo, TEMS, JDSU and R&S Romes); the latest information regarding supported files is available in Actix supported file formats. It also supports events like CSFB and IRAT. Customized analysis (e.g. for acceptance reports) is possible only with Analyzer Classic, if needed. Actix may have problems when handling large data sets (days or weeks, although this depends on the drive test tool). Operators may use ActixOne, a highly customizable web-based server where drive test logs are uploaded and which in turn generates custom reports for events and target KPIs.
• TEMS Discovery: Supports TEMS, Nemo Outdoor LTE, JDSU and QVoice LTE. Good for basic visualization use and for creating customized reports (although more complicated than with Actix or Nemo Analyze). Easily supports large data sets. A candidate to be included among the NOKIA preferred tools.
• XCAP (Accuver): Supports LTE for the XCAL drive test tool. Used in early projects; it has improved, although its features and usability are far from those of the other tools. Not a NOKIA preferred tool.
• Gladiator from JDSU (ex-Agilent): Supports LTE for JDSU. Not a NOKIA preferred tool.

In addition to the commercial solutions, the most popular NOKIA internal post-processing tools are:

• LTE FIT (Field performance Interpretation Tool): Originally designed to cover the lack of commercial post-processing tools, it processes LTE drive test data from several sources, e.g. different terminals, Wireshark, XCAL, and application tools like IPERF, amongst others. It is also valid for FDD and TDD MAC TTI traces. Besides, it supports the automatic creation of reports with the post-processed data in ppt. The software, user guide and additional information can be downloaded from [1].
• MUSA: Can be used for specific RF optimizations such as advanced coverage analysis and tilt/azimuth simulations from drive test measurements that are not typically available in 3rd-party tools. More information on MUSA is available in IMS.

Additionally, there are tools that help to visualize certain performance data like GEEditor (Google Earth Editor).

5.2 KPIs

KPIs are the network quality indicators used in the acceptance process; they demonstrate the performance of the network. Drive test KPIs are normally divided into coverage KPIs and service level KPIs. Coverage KPIs are obtained from the scanner measurements, whereas service level KPIs are derived from the measurement tool and related to the different applications tested. In general, service level KPI targets are valid in regions with sufficient coverage and good radio conditions. The KPIs available for analysis depend on the capabilities of the measurement system and on the network features used. A project-specific set of KPIs must be identified.

5.2.1 Coverage KPIs

Measurement sample analysis needs to take into account the coverage criteria determined by the reference signal RSRP and SINR. An example of how to calculate the coverage thresholds for a certain service was shown earlier in this document. The reporting range for RSRP is [-44 dBm … -140 dBm]. The RSRP threshold values depend on the KPI agreement with the customer and vary from project to project. They may also vary depending on the service that is tested. As previously mentioned, coverage is statistical in nature, and the coverage measured by drive tests is related to the outdoor coverage probability (normally @95%). Coverage ranges for radio conditions need to be agreed with the customer.

Table 6 shows ranges agreed by NOKIA in live projects for different loads:

Signal Conditions (0% Load)    SINR (dB)    RSRP (dBm)
Very Good                      > 25         > -75
Good                           15 - 25      -80 to -90
Average                        7 - 14       -90 to -100
Poor                           -2 - 6       < -100

Signal Conditions (100% Load)  SINR (dB)    RSRP (dBm)
Very Good                      20 dB        -80 dBm
Good                           5~10 dB      -85 dBm
Average                        1~8 dB       -95 dBm
Poor                           <0 dB        <-105 dBm

Table 6: SINR and RSRP Ranges

The above categories are also used when determining stationary test locations. SINR locations should be chosen such that the prevailing SINR is approximately in the middle of the given range.

Note: Measured SINR values depend on the measurement tool used. There can be up to 15 dB difference in the measured SINR values in the same location if different tools (e.g. a scanner vs. a terminal) are used. More information is available in [9]. RSRP thresholds can be more reliable.
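The 0% load SINR categories in Table 6 can be expressed as a simple classifier for post-processing scripts. This is a minimal sketch: the handling of values falling exactly between the published ranges (e.g. between 14 and 15 dB) is an assumption, not defined by the table.

```python
def classify_sinr_0pct_load(sinr_db):
    """Map a measured SINR (dB) to the 0% load signal-condition category
    from Table 6. Boundary handling between the published ranges is an
    assumption made for illustration."""
    if sinr_db > 25:
        return "Very Good"   # > 25 dB
    if sinr_db >= 15:
        return "Good"        # 15 - 25 dB
    if sinr_db >= 7:
        return "Average"     # 7 - 14 dB
    return "Poor"            # -2 - 6 dB and below
```

A post-processing tool can apply this per sample to produce the category distribution along the drive route.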

5.2.2 Service Level KPIs

NOKIA has specific documentation covering Service Level KPIs. References for target values can be found in:

• MBB and NPO KPI commitment targets guideline: to be used as the main source of information when it comes to KPI commitments. It includes KPI target values accounting for NPO services.
• LTE Field KPI targets document, available for each release. It contains target values achievable in lab and field conditions and references about how to achieve those values.
• MBB Performance Benchmarker tool, accessible also from the MINT platform. It contains live KPI values from NOKIA LTE networks, so it is a good reference for achieved values. Registration is required in order to access the tool. Instructions for registering are provided on the welcome page of the tool (link)

5.2.3 OSS KPIs

There are cases in which OSS KPIs are considered together with drive test KPIs as part of the acceptance project. It is necessary to agree with the customer on the OSS KPIs, the target values, the area (cluster/network-wide) and the period over which they will be measured.

It is also necessary to have traffic on the network that generates data making the OSS KPIs statistically reliable. This traffic may be generated by ‘friendly users’ such as operator employees and NOKIA engineers involved in the project, or by operator customers if the roll-out is done in several phases. The table below shows an example from a real project with proposed KPI targets.

Table 7: Network Acceptance Performance Statistical Requirements

5.3 Data Post-processing Considerations


5.3.1 Accessibility

Accessibility refers to the accessibility of a service to its users. It is expressed as the ratio of successful service requests to the total number of attempts. For an accessibility test to be meaningful, a significant number of samples needs to be collected; e.g. for a success rate with 0.2% granularity, at least 500 samples are needed. Service request attempts rejected due to insufficient access rights, bad user authentication, or errors in request parameters are not considered and should be excluded from the samples used for calculating the metric.
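The sample-size rule above follows from the fact that each failure changes the success rate by 100/N percentage points, so N must be at least 100/granularity. A minimal sketch of that calculation:

```python
import math

def min_samples_for_granularity(granularity_pct):
    """Minimum number of attempts so that a single failure changes the
    success rate by at most granularity_pct percentage points.

    Example from the guideline: 0.2% granularity requires 500 samples,
    since one failure in 500 attempts shifts the rate by 100/500 = 0.2%.
    """
    if granularity_pct <= 0:
        raise ValueError("granularity must be positive")
    return math.ceil(100.0 / granularity_pct)
```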

5.3.2 Throughput

For simplicity, data can be averaged over the measurement time. This means that for each download and upload, the following is calculated:
• Average IP throughput (Mbps)
• Average vehicle speed (km/h)
• Average SINR (dB)
• CDF

The average IP throughput is calculated as the file size (in Mbit) / (stop time – start time). The speed and SINR are calculated as the average value of the available samples between the start and stop time of the data transfer. If there are no samples available between the start and stop time of the data transfer for any of the variables, the test step is excluded.
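The averaging rules above can be sketched as two small helpers; the function names are illustrative, and returning None to signal the exclusion rule is an assumption about how a post-processing script might represent it.

```python
def average_ip_throughput_mbps(file_size_mbit, start_time_s, stop_time_s):
    """Average IP throughput: file size (Mbit) / transfer duration (s)."""
    duration = stop_time_s - start_time_s
    if duration <= 0:
        raise ValueError("stop time must be after start time")
    return file_size_mbit / duration

def average_of_samples(samples):
    """Average speed/SINR over the samples within the transfer window.

    Returns None when no samples exist between start and stop time, so
    the caller can exclude the test step as the guideline requires."""
    return sum(samples) / len(samples) if samples else None
```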

5.3.3 Reliability

Reliability (also retainability) is measured on all cycles with a successful connection attempt and does not have any signal quality requirements. It refers to the ratio between the number of successful calls and the total number of calls successfully set up. If given as a drop rate, it is the ratio between abnormal terminations and the number of all successfully established connections. To measure the ratio with statistical significance, it is necessary to have more than 100 samples.
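The drop-rate definition above can be sketched directly; the sample-count warning is an illustrative addition reflecting the >100-sample requirement stated in the text.

```python
def drop_rate_pct(abnormal_terminations, established_connections):
    """Drop rate [%]: abnormal terminations / successfully established
    connections. Per the guideline, more than 100 samples are needed for
    statistical significance."""
    if established_connections < 1:
        raise ValueError("no successfully established connections")
    return 100.0 * abnormal_terminations / established_connections
```

For example, 2 dropped calls out of 200 established connections give a 1.0% drop rate.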

5.3.4 Round Trip Time

Round Trip Time (RTT) is the common measurement for finding E2E latency in LTE access networks. It is measured with the ping application. The recommended number of measurements is >= 100. The first IP packet may have additional delay caused by the dynamic allocation of radio resources to non-GBR bearers. For this reason, the round trip delay of the first packet should not be considered in the statistics calculations. Ping tests are done under stationary and mobile conditions. RTT is measured on all successful samples and does not have any signal quality requirements.

It is normal to see spikes when measuring the RTT. These spikes are due to radio HARQ retransmissions. The target BLER is by default 10% in uplink and downlink, so on average one in every 10 initial transmissions will be retransmitted. Each HARQ retransmission results in an 8 ms delay increase, which means spikes of 8 ms are normal. This should be taken into account in KPI calculations. It is better to use median, minimum or trimmed-mean values rather than average values for RTT, to eliminate the impact of HARQ.
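The recommended treatment (discard the first ping, then report median, minimum and a trimmed mean) can be sketched as follows. The 10% trim ratio is an illustrative choice, not a value from this guideline.

```python
from statistics import mean, median

def rtt_statistics(rtt_ms, trim_ratio=0.1):
    """Summarize ping RTT samples per the guideline:
    - drop the first sample (extra delay from dynamic scheduling of
      non-GBR bearers),
    - report median and minimum,
    - report a trimmed mean that damps HARQ retransmission spikes
      (~8 ms per retransmission).
    trim_ratio (fraction cut from each tail) is an assumption."""
    samples = sorted(rtt_ms[1:])            # discard the first ping
    if not samples:
        raise ValueError("need at least two RTT samples")
    k = int(len(samples) * trim_ratio)      # samples to cut from each tail
    trimmed = samples[k:len(samples) - k] if k else samples
    return {
        "median": median(samples),
        "min": samples[0],
        "trimmed_mean": mean(trimmed),
    }
```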

5.4 Corrective actions

The drive test measurement analysis will show network issues that affect the achievement of KPIs. Some corrective actions are therefore necessary in order to improve the network quality for the acceptance. The usage of OSS tools such as NetAct Optimizer (i-SON Manager) to implement some of the corrective actions can be considered. Common actions taken during the acceptance phase are:
• Antenna tilt changes
• Antenna azimuth changes
• Parameter modifications (e.g. handover parameters, power control parameters)
• Finding core network issues or transport network capacity issues
• Finding/correcting crossed feeders
• Neighbour optimization (e.g. missing neighbours, deletion of ANR-added neighbours)

In LTE, the Physical Layer Cell ID can be used in a similar way to scrambling codes in 3G. Since the drive route should cover all sectors of all sites in the cluster, it should be possible to verify that the correct cell ID is received in each sector. If there are crossed feeders, multiple cell IDs may be received in one sector and no cell ID in another, or the sectors may appear swapped over. In LTE, the probability of X-feeders increases due to the usage of 2x2 MIMO. For more information about how to detect X-feeders, refer to [8].

The scope of the corrective actions is to improve the network performance. Another drive will be necessary after the corrective actions have been implemented, to verify the performance and especially to ensure the performance has not degraded.


6 Reporting

Once the cluster/network has passed the agreed KPIs, it is necessary to generate a report for customer approval. The report has to summarize the improvements achieved during the drive tests by comparing the initial cluster status (after the initial drive) with the final cluster status. The format and contents of the report need to be agreed with the customer and may vary from project to project. As a general recommendation, the report should clearly show the KPIs, and whether the cluster has passed the acceptance based on those KPIs, already in the executive overview. This facilitates cluster approval by high-level management on the customer side, who may not necessarily be interested in more detailed information.

It is common to use automated reports created by the post-processing tool (e.g. Actix or ActixOne) or NOKIA-specific automated scripts for tools like Nemo that, in addition to facilitating the performance analysis, can be presented to the customer for acceptance. A collection of customer reports used for acceptance is available at Sample Reports for Acceptance. Please note that this information is NOKIA confidential and should not be shared as such with other customers.

This section presents a summary of the main sections that the final report should include:

Executive overview
Achieved cluster/network KPIs. The following KPIs can be included in the acceptance report:
• Network Attach Success Rate [%] (accessibility)
• PS Data Call Success Rate [%] (accessibility)
• PS Data Call Drop Ratio [%] (retainability)
• Average Throughput DL [Kbps] (single and/or multiple user)
• Average Throughput UL [Kbps] (single and/or multiple user)
• Intra-LTE Handover Success Rate [%]
• Inter-RAT Handover Success Ratio [%] (when applicable)
• Latency (RTT)
• CSFB Success Rate [%] to 2G/3G (when applicable)
• CSFB Call Redirection Time to 2G/3G (when applicable)

Cluster status (pass/no pass)

Measurement overview
Drive test:
• Drive Route Summary
• Results, Plots and Graphs
• Analysis (problem areas summary)
• Solved (what has been done)
• Unsolved
• Unsolvable (as identified under “exclusions”)
• Future Solutions (sometimes this can be building a new site or proposing new features)

Corrective actions


• Hardware/implementation issues
• Azimuth, tilt adjustment
• Parameter Tuning or Change Proposal

Site Database
• Confirmed site database

Anomalies
• Hardware problems during test
• Performance degradation for reasons outside of RF (uncontrollable)


7 References

[1] LTE FIT Tool Home Page: https://sharenet-ims.inside.nokiasiemensnetworks.com/Open/403191108
[2] LTE RNP Guidelines: https://sharenet-ims.inside.nokiasiemensnetworks.com/Open/407824118
[3] Automated scripts for Nemo: https://sharenet-ims.inside.nokiasiemensnetworks.com/Open/362255067
[4] LTE Product Support in IMS (ATMN tests within each RLxx folder): https://sharenet-ims.inside.nokiasiemensnetworks.com/Open/TS_LTE_Public
[5] LTE Field Performance RL50 folder: https://sharenet-ims.inside.nokiasiemensnetworks.com/Open/494850933
[6] MBB and NPO KPI Commitments Guideline: https://workspaces.emea.nsnnet.net/sites/QoS/Documents/04%20Tools/KPI_commitment_targets_2.0_FINAL_new_NSN.pptx
[7] LTE Optimization Training: https://sharenet-ims.inside.nokiasiemensnetworks.com/Open/488706327
[8] LTE Optimization Guidelines: https://sharenet-ims.inside.nokiasiemensnetworks.com/Overview/D415266561
[9] SINR Measurement and scanner comparison: https://sharenet-ims.inside.nokiasiemensnetworks.com/Overview/D425813131
[10] T. Mecklin: Post-processing tools status: https://sharenet-ims.inside.nokiasiemensnetworks.com/Overview/D403860102

8 Glossary

ANR	Automatic Neighbour Relations
ATMN	Acceptance Test Manual
CDF	Cumulative Distribution Function
CM	Configuration Management
CQI	Channel Quality Indicator
EMM	EPS Mobility Management
ECM	EPS Connection Management
EPC	Evolved Packet Core
FER	Frame Error Rate
FMT	Field Measurement Tools
FTP	File Transfer Protocol
GBR	Guaranteed Bit Rate
HO	Handover
HSS	Home Subscriber Server
IMS	IP Multimedia Subsystem
KPI	Key Performance Indicator
MAPL	Maximum Allowed Path Loss
MCS	Modulation and Coding Scheme
MHA	Mast Head Amplifier (also TMA: Tower Mounted Amplifier)
MOC	Mobile Originated Call
MTC	Mobile Terminated Call
nGBR	non Guaranteed Bit Rate
OFDMA	Orthogonal Frequency Division Multiple Access
PCI	Physical Layer Cell Identity
PDCCH	Physical Downlink Control Channel
PDCP	Packet Data Convergence Protocol
PDSCH	Physical Downlink Shared Channel
PM	Performance Management
PRACH	Physical Random Access Channel
P-SCH	Primary Synchronization Channel
QoS	Quality of Service
QCI	Quality of Service Class Identifier
RAB	Radio Access Bearer
RB	Resource Block
RACH	Random Access Channel
RAT	Radio Access Technology
RE	Resource Element
RLC	Radio Link Control (Layer)
RS	Reference Signal
RSRP	Reference Signal Received Power
RSRQ	Reference Signal Received Quality
RSSI	Received Signal Strength Indicator
RTT	Round Trip Time
SAE-GW	System Architecture Evolution - Gateway
SC-FDMA	Single Carrier Frequency Division Multiple Access
SFN	System Frame Number
SINR	Signal-to-Interference and Noise Ratio
S-SCH	Secondary Synchronization Channel
TCP	Transmission Control Protocol
UDP	User Datagram Protocol
UMTS	Universal Mobile Telecommunications System
