
Compact RF testing solutions for 5G millimeter wave cellular networks


By Stojce Dimov Ilcev, Durban University of Technology

The implementation of fifth-generation (5G) cellular network technology allows telecom operators and Internet service providers to meet the ever-increasing demands of their customers for higher data rates and greater access capacity. The 5G wireless system is being introduced in different ways, and much remains unknown about how network operators, providers, homeowners, businesses and mobile customers will use emerging 5G networks and devices. Some cellular operators see 5G as an incentive to increase bandwidth to cellular devices in the currently used spectrum below 6GHz, whilst others see 5G as fixed wireless access that replaces legacy wired infrastructure, operating in the millimeter-wavelength range above 28GHz. Regardless of access and implementation, the goal of 5G is to enable wireless data speeds greater than 1Gb/s, and potentially over 20Gb/s, which calls for adequate RF test solutions for 5G millimeter wave (mmWave) cellular networks.

Cellular wireless backhaul networks

During the last few decades, cellular wireless communication has evolved from an early, analogue, voice-only system to today's intelligent communication system; see Figure 1. With the continuous progress of each generation, cellular communication systems have become more sophisticated and versatile, enabling new consumer services that support countless fixed and mobile cellular wireless applications. In the 1980s, the use of 1G analogue wireless voice communications began, whilst in the 1990s an improvement was made by introducing 2G digital voice wireless communications with higher transmission capacity.

At the beginning of the new millennium, 3G was introduced, bringing wireless data with broadband Internet access anytime and anywhere. This cellular broadband network, combined with innovative smartphone technologies, brought a significant change to the wireless Internet experience: through an app-centric interface, users can access their email, social media, music, high-definition video streaming, gaming, and so on.

Then, during the 2010s, 4G technology deployed the Long Term Evolution (LTE) standard, with an IP core for video and data streaming over wireless broadband via cellular devices and data terminals, built on Global System for Mobile Communications (GSM) / Enhanced Data rates for GSM Evolution (EDGE) and Universal Mobile Telecommunications Service (UMTS) / High-Speed Packet Access (HSPA) technologies.

Finally, in early 2020, a new era began with the introduction of modern solutions for broadband, ultra-high-definition (HD) imaging and ultra-low-latency network slicing, enabled by a scalable Software-Defined Networking (SDN) core.

Figure 1: The evolution of cellular wireless communication standards

New OTA RF test methodology

5G networks are already rolling out, so it is important for operators to establish the methodology for RF testing of 5G mmWave over-the-air (OTA) standards for cellular devices. Massive 5G technologies such as Multiple-Input/Multiple-Output (MIMO) devices require OTA calibration and RF measurement in a phase-coherent test system. Device and antenna array size make it very challenging to measure mmWave RF performance in a true far-field OTA environment because of the large distances required, which also impact the link budget and test system cost.

The mmWave midfield (MF) OTA RF test methodology is designed to minimise size and cost. This advanced system can perform radio antenna array phase coherence calibration and support RF parametric measurement per 3rd Generation Partnership Project (3GPP) conformance test specifications.

The mmWave RF band offers huge bandwidth, unparalleled compared with the frequency spectrum available today for most fixed and mobile wireless communication and radar sensor systems. Consequently, mmWave frequencies are currently being exploited for the development of emerging broadband, high-speed and high-resolution RF systems. For example, the fifth generation of wireless cellular technology is expected to operate in the mmWave band so as to overcome the limitations of 4G. Unlike 4G, 5G networks and systems will be optimised for both Human-Centric Communication (HCC) and Machine-Type Communication (MTC). Therefore, 5G has the potential to be a key enabler of emerging bandwidth-, speed- and latency-critical HCC and MTC applications, such as remote medical diagnosis and surgery, vehicle-to-vehicle and vehicle-to-infrastructure communication for autonomous driving, wireless transmission of uncompressed ultra-HD video, augmented and virtual reality, the tactile internet and a multitude of Internet of Things (IoT) applications leading to smart solutions in health, energy, cities, transportation, industry and the home.

Newly developed 5G cellular networks and telephone devices began their early production phase in 2019, while several RF test solutions are being evaluated for the next phase of large-scale production. The cellular 5G system will thus play a critical role in meeting the bandwidth requirements of mobile customers from 2020 onwards. The ubiquity of smartphones, particularly in developed markets, continues to increase mobile data consumption exponentially. As the load on networks rises, backhaul (the connection of the base station to the core network) must keep pace. For 5G mmWave products providing new backhaul capacity, over-the-air tests represent a new set of test challenges compared to products below 6GHz.

Each system is different, and the set of RF interference challenges varies with every cellular device or system development and deployment. This diverse range of unique testing requirements calls for calibrated noise sources and programmable noise generators that are highly customisable. As industry moves toward these higher speeds and new spectrum, new components are required to build the radios necessary for 5G deployment. Devices are becoming more tightly integrated, more MIMO devices are being incorporated into designs, and high-frequency phased-array systems are being deployed. These new devices require new testing techniques in the lab, in production and in the field to ensure that the quoted high data rates hold up in the real world, in the presence of real-world noise and interference.

OTA RF test approaches are becoming the standard for testing 5G New Radio (5G NR) User Equipment (UE) and base stations (BS), especially in mmWave. The move to higher frequencies, including sub-6GHz Frequency Range 1 (FR1) and mmWave Frequency Range 2 (FR2), arises in large part from the crowding of the RF spectrum. For these reasons, 5G NR networks and UE devices require advanced technologies and OTA measurement methods to characterise performance accurately.

Figure 2: Noise power ratio testing

Noise power ratio measurements

Noise power ratio (NPR) measures the inter-modulation and noise floor of RF transponders and components in wireless systems. NPR is defined as the relationship between the total power density in a channel and the noise power density, and is commonly used to describe the distortion caused by in-band multi-carrier inter-modulation in a wideband radio channel.

The NPR concept has been around since the early days of Frequency Division Multiplexed (FDM) transmission systems; it is simply used to measure the “quietness” of an unused channel in a multi-channel system when there’s random activity on the others. Noise and inter-modulation distortion products fall into the unused channel causing less than ideal performance. Originally used to check 4kHz-wide voice channels in FDM links, the same concept is useful today in characterising multi-channel wideband communication systems; however, there are some important differences in the modern measurement techniques.

High requirements for 5G data speeds push the limits of video amplifier bandwidth from 10MHz to 500MHz and beyond. To deliver these data rates, amplifiers must be tested to see how they reproduce complex modulated signals with high peak-to-average power ratios. Amplifier performance is important because nonlinearity reduces the dynamic range of the communication channel by limiting the lowest signal level that can be received correctly.

NPR is a convenient way to test for nonlinearity, and the measurement can be done at component or system level. Traditionally, two-tone testing is used to measure the inter-modulation distortion of amplifiers, but this method has limitations since the signals used do not emulate the higher peak-to-average power ratios that the amplifier is required to handle. The peak-to-average ratio of multiple Orthogonal Frequency Division Multiplexing (OFDM) signals is much greater than that experienced during the two-tone test. The large peaks stress the amplifier or device under test to a greater degree, making the two-tone test less useful and the noise power ratio measurement a better figure of merit for system performance.
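As a quick illustration of this point, the peak-to-average power ratio (PAPR) of a two-tone signal can be compared with that of Gaussian noise standing in for many concurrent OFDM carriers; the sample rate and tone frequencies below are arbitrary illustrative choices:

```python
import numpy as np

fs = 10_000                      # sample rate, Hz (arbitrary)
t = np.arange(0, 1.0, 1 / fs)    # 1 s of samples

def papr_db(x):
    """Peak-to-average power ratio of a real signal, in dB."""
    return 10 * np.log10(np.max(x**2) / np.mean(x**2))

# Classic two-tone test signal: two equal-amplitude carriers.
two_tone = np.cos(2 * np.pi * 100 * t) + np.cos(2 * np.pi * 110 * t)

# Gaussian noise as a stand-in for a dense multi-carrier (OFDM-like) load.
rng = np.random.default_rng(0)
noise = rng.standard_normal(t.size)

print(f"two-tone PAPR: {papr_db(two_tone):.1f} dB")   # ~6 dB
print(f"noise PAPR:    {papr_db(noise):.1f} dB")      # noticeably higher
```

The noise-like signal exhibits several dB more peak stress than the two-tone signal, which is why a noise source exercises an amplifier more realistically.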

NPR testing is done with a broadband noise source that represents any or all the carriers in the specified operating bandwidth of an amplifier. Using a noise source has the added benefit of being very economical when compared to high-end signal generators required to create tones in the millimeter wave range.

Figure 2 shows the R&S FSW-K19 noise generator, which creates white noise to simulate a number of carriers or channels operating concurrently. A deep notch is created in the band of noise, typically at the centre of the band, and the test signal is then applied to the device or system under test. The amount by which the notch fills in, observed here with an R&S FPC1500 spectrum analyser, indicates the non-linearity generated in the device or system under test and is used to determine the NPR of an active component or of the system as a whole.

Figure 3: NPR measurement

As mentioned previously, NPR testing was historically developed to evaluate FDM communication systems. A typical system might consist of 4kHz-wide voice channels, stacked up for transmission into a higher-bandwidth signal. At the receiving end, the FDM data was de-multiplexed and converted back to 4kHz voice channels. Noise and distortion were added to the signal after passing through amplifiers, repeaters, channel banks, etc. Because analogue-to-digital converters (ADCs) also process broadband signals, and because their applications are varied, the same figure of merit finds a modern-day application in evaluating ADC performance.

Equipped with the R&S FSW-K19 option, the R&S FSW signal and spectrum analyser offers a convenient and straightforward way to measure the NPR over a maximum of 25 notches. NPR testing measures how quiet one unused channel in a wideband system remains when the other channels cause noise in it due to inter-modulation. Test signals are usually generated at an Intermediate Frequency (IF) and up-converted to the RF band of interest. The NPR is the ratio of the sum of the power inside the notched bins (PNi) to the sum of the power in an equal number of bins outside the notch (PNo), as given by the following relation:

NPR = 10log10 (PNi / PNo)
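This relation can be sketched numerically on FFT bin powers; the flat noise loading and notch placement below are hypothetical, chosen only to show a 50dB-deep notch:

```python
import numpy as np

def npr_db(bin_powers_w, notch_bins):
    """NPR per the relation above: power summed over the notched FFT bins
    (PNi) against the power in an equal number of bins outside the
    notch (PNo), expressed in dB."""
    notch_bins = np.asarray(notch_bins)
    outside = np.setdiff1d(np.arange(len(bin_powers_w)), notch_bins)
    p_ni = bin_powers_w[notch_bins].sum()
    # Equal bin counts on both sides of the ratio, as the definition requires.
    p_no = bin_powers_w[outside[: len(notch_bins)]].sum()
    return 10 * np.log10(p_ni / p_no)

# Toy spectrum: flat noise loading of 1 mW per bin, with a deep notch.
powers = np.full(100, 1e-3)
notch = np.arange(45, 55)
powers[notch] = 1e-8            # residual power left in the notch
print(f"NPR: {npr_db(powers, notch):.1f} dB")   # -50.0 dB for this notch depth
```

As inter-modulation products fill the notch in, the residual power rises and the measured NPR moves toward 0dB.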

Care must be taken to ensure this process does not degrade the notch depth. In wideband NPR testing, the noise figure of the amplifiers can often limit the maximum NPR measured. Figure 3 shows a common method of NPR measurement. Either a Gaussian or a uniformly-distributed signal may be used as the noise source, but for ADC devices the Gaussian source is more relevant. A low-pass filter is used to prevent noise aliasing, which would lead to a higher, inaccurate NPR measurement.

In this process it is necessary to introduce the Y-factor, the ratio of the hot and cold power of the DUT. The Y-factor indicates the quality of the measurement tolerances and uncertainties. To obtain the result, the application measures the DUT power with the noise source turned on (hot power) and with the noise source turned off (cold power), such that:

Y-factor = Non / Noff

where Non = noise power with the noise source on (hot power); and Noff = noise power with the noise source off (cold power), both as linear power values.
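A minimal sketch of this hot/cold calculation, using hypothetical analyser readings; since the Y-factor is a linear power ratio, the dBm readings are converted to linear power first:

```python
import math

# Hypothetical analyser readings at the DUT output, in dBm.
p_hot_dbm = -62.0    # noise source on  (hot power)
p_cold_dbm = -68.0   # noise source off (cold power)

# The Y-factor is a linear power ratio, so convert from dBm to mW first.
p_hot_mw = 10 ** (p_hot_dbm / 10)
p_cold_mw = 10 ** (p_cold_dbm / 10)
y_factor = p_hot_mw / p_cold_mw

print(f"Y-factor: {y_factor:.2f}  ({10 * math.log10(y_factor):.1f} dB)")
```

A larger Y-factor means the DUT's own noise contributes less relative to the source, which generally improves measurement confidence.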

Noise figure measurements

Noise figure is a key performance parameter for any RF or millimeter-wave component or system; it gives the designer an indication of the signal-to-noise degradation caused by the noise generated or present in the system. The lower the noise figure, the lower the signal-to-noise degradation and the better the system performance.

As communication data rates and operating frequencies increase for 5G applications, systems are even more sensitive to signal-to-noise degradation on the communication links, and there must be a high degree of confidence that what is designed in the lab will operate in the field. Measuring the noise figure can be done in a variety of ways and with a range of different test equipment, including spectrum analysers, noise figure meters, noise figure analysers or vector network analysers. Regardless of testing technique, any noise figure measurement requires an accurately-calibrated broadband noise source.

Figure 4: Noise figure measurement with a calibrated noise source

The maximum data transmission rate of a communications system and the sensitivity of a radar system are largely dependent on the signal’s Signal-to-Noise Ratio (SNR). The SNR mainly depends on the noise factors of the components in the signal path, particularly with low input levels. The noise factor is the ratio of the input SNR to the output SNR of a linear two-port device, such as an amplifier. The noise factor is frequency-dependent and is typically given as a decimal logarithmic value in decibels (dB), in which case it is called the noise figure. Thus, knowledge of the exact noise figure value is essential for the development, optimisation and production of virtually all RF systems.

In the past, noise factors were determined using a dedicated noise tester, but noise figure and gain are now often measured with a spectrum analyser such as the R&S FPC1500; a block diagram of the noise figure measurement is shown in Figure 4. This configuration includes the R&S FSW-K19 noise generator, a switch, calibration, an isolator and the amplifier under test. The spectrum analyser measures the noise figure of a DUT and displays the results graphically and numerically. Each graphical result display shows the noise figure from a different perspective. In the default configuration, the application shows the noise figure of the DUT, the gain of the DUT and the corresponding Y-factor. In addition, it shows the numerical results of the measurement, where the noise factor is the ratio of the SNR at the DUT input (SNRin) to the SNR at the DUT output (SNRout), per the following relation:

Noise Factor F = SNRin / SNRout; Noise Figure = 10log10 (F) dB
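A small worked example of this relation, with hypothetical SNR values; the linear ratio is the noise factor, and its decibel value is the noise figure:

```python
import math

# Hypothetical linear SNR values at the DUT input and output.
snr_in = 1000.0    # 30 dB at the amplifier input
snr_out = 400.0    # ~26 dB at the amplifier output

noise_factor = snr_in / snr_out                  # linear noise factor F
noise_figure_db = 10 * math.log10(noise_factor)  # noise figure in dB

print(f"F = {noise_factor:.2f}, NF = {noise_figure_db:.2f} dB")
```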

From a physical point of view, the noise figure and the noise temperature have a positive range (including zero). Due to the mathematical operations the application performs, however, the results can be negative; this sometimes happens because of incorrect calibration or variance in the measured values. The measurement uses the Y-factor method, which yields accurate results even for small noise figures. For this measurement, a noise source with a known Excess Noise Ratio (ENR) is used in addition to the spectrum analyser. The ENR indicates the increase in the spectral intensity of the noise, known as the Power Spectral Density (PSD), when the noise source is switched on.

Noise figure values can be measured with a variety of test equipment from different manufacturers, but the measurement process always relies on an accurately-calibrated broadband noise source for reliable, repeatable measurements. In the example of Figure 4, a spectrum analyser performs the noise figure measurement and supplies the DC voltage that powers the noise source. The noise source is first used with the amplifier under test switched out, to calibrate the test system; the amplifier is then switched back in to perform the noise figure measurement. The isolator and low-noise amplifier in the setup reduce measurement uncertainty by reducing reflected power between components in the test system and by reducing the noise figure of the test setup itself. The overall accuracy of the noise figure measurement is primarily determined by the accuracy of the noise-source calibration.
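Putting the ENR and the Y-factor together, the noise figure follows from the standard Y-factor relation F = ENR/(Y − 1). The sketch below uses hypothetical values and, for simplicity, neglects the second-stage correction for the test setup's own noise:

```python
import math

def noise_figure_db(enr_db, y_factor):
    """Noise figure via the Y-factor method: F = ENR / (Y - 1),
    with the ENR converted to a linear ratio internally.
    Second-stage (test setup) noise correction is neglected here."""
    enr_lin = 10 ** (enr_db / 10)
    f_lin = enr_lin / (y_factor - 1)
    return 10 * math.log10(f_lin)

# Hypothetical values: a 15 dB ENR source and a measured Y-factor of 8.
print(f"NF = {noise_figure_db(15.0, 8.0):.2f} dB")
```

A larger measured Y-factor for the same ENR implies a smaller noise figure, consistent with the qualitative description above.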

Figure 5: OTA testing with a calibrated noise source

OTA testing and chamber calibration

As chipsets for 5G become more integrated, with on-board power amplifiers and antennas, and frequencies increase to mmWave range, conducted RF power measurements are now physically impractical or even impossible to perform. These new MIMO devices provide minimal access to test points for making measurements, and physical connection for conducted testing is not possible; instead, radiated or over-the-air (OTA) testing is required.

The advanced MIMO technology uses multiple antennas at the transceiver to improve transmission performance. Either multiple transceivers transmit via separate antennas over uncorrelated propagation paths to achieve higher throughput for one or more users (spatial multiplexing), or the same output signal is transmitted via multiple antennas and combined in the receiver (Rx) to improve signal quality (Rx diversity). Thanks to the large number of antenna elements in massive MIMO systems, both concepts can be combined. Thus, an antenna radio system that supports beamforming as well as spatial multiplexing is known as a massive MIMO system. Although massive MIMO is applied only in base stations, wireless devices are also using increasing numbers of antennas to implement MIMO techniques.

In an OTA testing system, a calibrated noise source outside the chamber is connected to a radiating reference antenna, with known gain and bandwidth, inside the chamber; see Figure 5. Multiple receive antennas inside the chamber are connected to a suitable instrument, such as a spectrum analyser, outside the chamber. The noise source can have one or two known ENR values with calibration data for the bandwidth of interest; the benefit of having two ENR levels is the ability to determine the Y-factor noise figure of the device under test (DUT) for radiated measurements. The chamber may be equipped with a precision positioner that can manipulate the DUT so that its receive antennas are exposed to the calibrated output of the noise source. Capturing data from each antenna and comparing it to the reference signal generated by the noise source provides a quantified and calibrated model of the chamber and antennas in the test system.

To overcome the lack of physical connections, engineers are developing OTA testing techniques to quantify and analyse devices inside RF test chambers. These chambers allow devices to be remotely activated and subjected to a variety of tests with transmit and receive antennas inside the chamber. Thus, these antennas are typically connected to a variety of signal sources to stimulate the device and measuring instruments like spectrum analysers, vector network analysers or power meters to capture and measure the response.

In order to make reliable and repeatable measurements inside any chamber, the chamber, and the test system as a whole, need to be calibrated and quantified with certification. Noise sources are the ideal device for this type of calibration process as they provide a known source with calibrated data points, which can be used to determine cable loss, air path loss, antenna efficiency and total chamber response.

After the system is calibrated and quantified, the same noise sources with known characteristics can be used as a reference source for the DUT to receive signals. Noise sources for OTA testing also act as a cost-effective alternative to expensive microwave and millimeter-wave signal generators.

5G mmWave devices operating above 24GHz incorporate millimeter-sized patch antenna arrays or dipole antennas that become an integral part of the device module packaging, making OTA testing the only way to characterise and test their performance. However, a mmWave test chamber can introduce significant path loss, more than that of the cables and connectors used. Understanding how to calculate the total link budget is a critical step in 5G mmWave OTA testing. Each patch antenna element in a 5G mobile or fixed-access device can transmit or receive electromagnetic waves in either the vertical or horizontal polarisation.

Unlike wireless products operating at sub-6GHz frequencies, mmWave products introduce a new test challenge related to OTA testing. Namely, over-the-air path losses, measured in dB, can be significant at mmWave frequencies relative to contacted cable and connector losses. For example, a 2.92mm connector-cable assembly can have a path loss of about 2.75dB/m at 40GHz, whereas the over-the-air path loss at the same frequency is about 64dB at a distance of 1m. The OTA path loss can be calculated using the following equation when the distance R between the transmitting and receiving antennas is equal to or greater than the far-field (FF) region distance:

OTA Path Loss = 10log10 [λ² / (4πR)²] dB

where λ is the wavelength of the operating frequency in metres and R is equal to or greater than the FF region distance, the distance at which the spherical wave can be considered a plane wave at the receiving antenna, thus fulfilling the following requirement:

R ≥ 2D² / λ

where value D is the largest dimension of the apertures (that is, the maximum effective size of the antenna) of either the transmitting (D1) or receiving (D2) antennas.
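The two relations above can be evaluated together; the 5cm aperture and 40GHz frequency below are illustrative choices, and the path loss is expressed as a positive loss in dB:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def far_field_distance_m(d_m, freq_hz):
    """Far-field boundary R >= 2*D^2/lambda for the largest aperture D."""
    lam = C / freq_hz
    return 2 * d_m**2 / lam

def ota_path_loss_db(r_m, freq_hz):
    """Free-space OTA path loss, 10*log10[lambda^2 / (4*pi*R)^2],
    returned as a positive loss in dB."""
    lam = C / freq_hz
    return -10 * math.log10(lam**2 / (4 * math.pi * r_m) ** 2)

f = 40e9  # 40 GHz, as in the worked example above
print(f"far-field distance for a 5 cm aperture: "
      f"{far_field_distance_m(0.05, f):.3f} m")
print(f"OTA path loss at 1 m: {ota_path_loss_db(1.0, f):.1f} dB")  # ~64 dB
```

The 1m result reproduces the roughly 64dB figure quoted in the text for 40GHz.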

Calculating the total OTA test chamber link budget is critical for making accurate DUT antenna measurements. The resulting net link-budget loss is combined with the measurements made by the mmWave test instrument to determine the actual radiated power and phase generated by each DUT antenna array element, or, likewise, when generating mmWave signals from the test horn antenna into the DUT. Once a DUT antenna array with aperture D1, a chamber test horn antenna with aperture D2 and a chamber with far-field distance R have been chosen, the Friis transmission equation can be used to calculate the overall link budget of the OTA test setup, as per the following:

PR = PT GT GR λ² / (4πR)²

where PR = Power at the receiving antenna; GR = Gain of the receiving antenna; GT = Gain of the transmitting antenna; PT = Power at the transmitting antenna; and R = FF distance between the two antennas.
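In dB terms the Friis equation reduces to adding the antenna gains to the transmit power and subtracting the free-space path loss. The sketch below uses hypothetical gains and a hypothetical 28GHz chamber geometry:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def friis_received_power_dbm(pt_dbm, gt_db, gr_db, freq_hz, r_m):
    """Friis transmission equation, PR = PT*GT*GR*lambda^2/(4*pi*R)^2,
    evaluated in dB terms."""
    lam = C / freq_hz
    fspl_db = 20 * math.log10(4 * math.pi * r_m / lam)  # free-space path loss
    return pt_dbm + gt_db + gr_db - fspl_db

# Hypothetical 28 GHz OTA link: 10 dBm transmit power, 20 dBi test horn,
# 6 dBi DUT array element gain, 0.75 m chamber far-field distance.
pr = friis_received_power_dbm(10.0, 20.0, 6.0, 28e9, 0.75)
print(f"received power: {pr:.1f} dBm")
```

Such a budget shows how quickly over-the-air loss eats into the dynamic range available to the mmWave test instrument.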

With the advent of 5G mmWave wireless devices of various sizes and applications, each requiring different architectures and sizes of mmWave antennas, it is critical for test engineers to understand the differences in OTA test chambers and test techniques. Direct Far Field (DFF) and Compact Antenna Test Range (CATR) are two such test methods supported by 3GPP Technical Report (TR) 38.810, Study on Test Methods, for 5G Frequency Range 2 (FR2) mmWave devices. Since CATR OTA test chambers can cost up to ten times more than DFF chambers, a test engineer must decide which is best suited to the intended application and test requirements.
