This paper presents the first receiver supporting non-contiguous intra-band and inter-band carrier aggregation, capable of receiving up to three 20 MHz LTE carriers simultaneously. The single-chip receiver implements a reconfigurable architecture in 65 nm CMOS, occupies 14.8 mm², and consumes 155 mW and 435 mW when receiving one and three carriers, respectively.
The objective of this paper is to improve knowledge of the directional channel characteristics at the base station, particularly in elevation. For this purpose, a channel measurement campaign has been performed. A powerful new method for super-resolution channel estimation has been used to obtain a detailed picture of the directional characteristics of the channel. This has further improved the understanding of when processes such as diffraction over rooftops and/or specular reflections are important. The findings herein have been incorporated into a model for elevation angle dispersion, which is proposed as an extension to commonly used directional channel models such as the ITU IMT-Advanced model.
This paper provides a high-level overview of LTE Rel-10, sometimes referred to as LTE-Advanced. First, a brief overview of the first release of LTE and some of its technology components is given, followed by a discussion of the IMT-Advanced requirements. The technology enhancements introduced to LTE in Rel-10 (carrier aggregation, improved multi-antenna support, relaying, and improved support for heterogeneous deployments) are described. The paper is concluded with simulation results showing that LTE Rel-10 fulfills and even surpasses the requirements for IMT-Advanced.
The potential link performance gain obtained with up to 8x8 MIMO transmission, as standardized in 3GPP LTE Release 10, has been evaluated in an indoor measurement campaign using a testbed implementation. For well-separated antennas, the results show increasing downlink throughput with an increasing number of transmit and receive antennas, up to a median throughput of 335 Mbps for an 8x8 MIMO configuration on a 20 MHz carrier. A similar, only slightly lower, throughput is achieved when using a compact UE array of a size more reasonable for a consumer device implementation.
MIMO is one of the techniques used in LTE Release 8 to achieve very high data rates. A field trial was performed in a pre-commercial LTE network. The objective was to investigate how well MIMO works with realistically designed handhelds in band 13 (746-756 MHz in the downlink). In total, three different handheld designs were tested using antenna mockups. In addition to the mockups, a reference antenna design with less stringent restrictions on physical size and excellent properties for MIMO was used. The trial comprised test drives in areas with different characteristics and with different network load levels. The effects of hands holding the devices and of using the devices inside a test vehicle were also investigated. In general, it is very clear from the trial that MIMO works very well and gives a substantial performance improvement at the tested carrier frequency if the antenna design of the handheld is well made with respect to MIMO. In fact, the best of the handhelds performed similarly to the reference antenna.
Multiple-input–multiple-output (MIMO) is a technique to achieve high data rates in mobile communication networks. Simulations are performed at both the antenna level and the Long-Term Evolution (LTE) system level to assess the performance of realistic handheld devices with dual antennas at 750 MHz. It is shown that MIMO works very well and gives a substantial performance gain in user devices with a quarter-wavelength antenna separation.
In this paper we investigate the feasibility of using microwave frequencies for fixed non-line-of-sight wireless backhaul connecting small-cell radio base stations to an aggregation node in an outdoor urban environment, i.e., a typical heterogeneous network scenario. We perform system-level simulations of a point-to-point system in which wave propagation is based on diffraction over rooftops. We further investigate the effects of carrier frequency, interference, antenna height, rain, and tolerance to antenna alignment errors. It is found that higher frequencies offer not only larger bandwidths but also higher antenna gains, which would ideally work to their advantage. However, these advantages may be lost when antenna alignment errors and rain are taken into account. Different frequencies simply have different trade-offs.
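The frequency trade-off described in the abstract can be made concrete with a toy link-budget sketch (a minimal illustration under assumed parameters; the function names, aperture efficiency, and rain figures are not taken from the paper). For a fixed antenna aperture, each antenna's gain grows with the square of the carrier frequency while free-space path loss also grows with the square, so with directive antennas at both ends the net margin improves with frequency until rain attenuation erodes the advantage:

```python
import math

C = 3e8  # speed of light, m/s

def fspl_db(f_hz, d_m):
    # Free-space path loss (Friis): 20*log10(4*pi*d*f/c)
    return 20 * math.log10(4 * math.pi * d_m * f_hz / C)

def aperture_gain_dbi(area_m2, f_hz, efficiency=0.6):
    # Gain of a fixed-aperture antenna: G = 4*pi*A*eta / lambda^2
    lam = C / f_hz
    return 10 * math.log10(4 * math.pi * area_m2 * efficiency / lam ** 2)

def link_margin_db(f_hz, d_m, area_m2, rain_db_per_km=0.0):
    # Relative link margin: TX + RX antenna gain minus path loss and rain attenuation
    g = aperture_gain_dbi(area_m2, f_hz)
    return 2 * g - fspl_db(f_hz, d_m) - rain_db_per_km * d_m / 1e3
```

With no rain, tripling the frequency at fixed aperture nets about +9.5 dB (two antennas gain 2 × 9.5 dB while path loss grows by 9.5 dB); a rain-attenuation difference larger than that reverses the ranking, mirroring the paper's conclusion that the advantages may be lost.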
Since the seminal paper by Knopp and Humblet, which showed that the system throughput of a single-cell system is maximized if only one terminal transmits at a time, there has been large interest in opportunistic communications and its relation to various fairness measures. In multicell systems, on the other hand, there is a need to allocate transmission power such that some overall utility function is maximized, typically under fairness constraints. Furthermore, in multicell systems the degrees of resource allocation freedom include serving cell selection, which allows for load balancing and thereby the efficient use of radio resources. In this paper we formulate the joint serving cell selection (link selection) and power allocation problem as an optimization task whose purpose is to maximize either the minimum user throughput or the multicell sum throughput. The max-min problem and a simplified max-throughput problem are both NP-hard, and we therefore propose heuristic solution approaches. We present numerical results that give new and valuable insights into the trade-off between fair and sum-throughput-optimal joint resource allocation strategies.
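To make the max-min link selection objective concrete, the following toy sketch exhaustively searches serving-cell assignments under fixed per-cell powers and a simplified Shannon-rate SINR model (all names and the model itself are illustrative assumptions, not the paper's formulation). The brute force is tractable only for tiny instances, which is exactly why heuristics are needed at scale:

```python
import itertools
import math

def rates(assign, G, P, noise=1e-9):
    # Shannon rates for each user given serving-cell assignment `assign`,
    # path gains G[user][cell], and fixed per-cell powers P[cell]
    out = []
    for u, c in enumerate(assign):
        signal = G[u][c] * P[c]
        interf = sum(G[u][j] * P[j] for j in range(len(P)) if j != c)
        out.append(math.log2(1 + signal / (interf + noise)))
    return out

def max_min_assignment(G, P):
    # Exhaustive search over all serving-cell assignments for the max-min
    # throughput objective (exponential in the number of users)
    best, best_min = None, -1.0
    for assign in itertools.product(range(len(P)), repeat=len(G)):
        r = min(rates(list(assign), G, P))
        if r > best_min:
            best, best_min = list(assign), r
    return best, best_min
```

For two users each close to a different cell (G = [[1.0, 0.1], [0.1, 1.0]], equal powers), the search assigns each user to its strong cell, since serving both from one cell leaves one user interference-limited.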
In radio network performance evaluations it is of interest to use simulation environments that represent realistic networks. In this paper, two widely used simulation environments are benchmarked against live network path gain measurements. Some parameter adjustments are made to fit the simulation environments to the measured path gain and cell isolation statistics. The need to modify base station antenna model parameters to match the intra-site cell isolation in real networks is identified and elaborated on. In particular, an increase of the maximum attenuation to 35 dB, the introduction of mechanical tilt, and a reduction of the vertical beamwidth are needed. In addition, some adjustments are made to better represent the assumed traffic distribution, with 80% indoor users and expected propagation losses, resulting in a path gain level on par with the measured network.
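The antenna model adjustments above can be sketched with the common 3GPP-style parabolic vertical pattern; only the 35 dB attenuation cap comes from the text, while the tilt and beamwidth defaults below are illustrative assumptions:

```python
def vertical_gain_db(theta_deg, tilt_deg=6.0, beamwidth_deg=6.5, max_att_db=35.0):
    # 3GPP-style parabolic vertical antenna pattern relative to boresight:
    # attenuation 12*(angle offset / 3 dB beamwidth)^2, capped at max_att_db.
    # Mechanical tilt shifts the boresight elevation angle.
    return -min(12.0 * ((theta_deg - tilt_deg) / beamwidth_deg) ** 2, max_att_db)
```

Raising the cap to 35 dB and adding tilt deepens the nulls between co-sited sectors' vertical patterns, which is what increases the simulated intra-site cell isolation toward the measured values.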
In this paper we introduce a novel framework for traffic identification that employs machine learning techniques, focusing on the estimation of multiple traffic-influencing factors. The effect of these factors is handled by training several machine learning models. We combine the outcomes of the multiple models via a recombination algorithm to achieve high overall true positive and true negative ratios and low overall false positive and false negative ratios. The proposed method can improve the performance of any machine-learning-based traffic identification engine, making it capable of efficient operation in a changing network environment, i.e., when the probing node is trained and tested at different sites.
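One simple way to recombine the outputs of several per-factor models is a reliability-weighted vote; the sketch below is an illustrative stand-in for the paper's recombination algorithm (the weighting scheme and all names are assumptions), showing how per-model predictions and per-model reliability scores could be merged into one label per flow:

```python
from collections import defaultdict

def recombine(predictions, weights):
    # predictions: one list of labels per model, aligned by flow index.
    # weights: one reliability score per model (e.g. validation accuracy).
    # For each flow, each model votes for its label with its weight;
    # the label with the highest total weight wins.
    out = []
    for i in range(len(predictions[0])):
        score = defaultdict(float)
        for preds, w in zip(predictions, weights):
            score[preds[i]] += w
        out.append(max(score, key=score.get))
    return out
```

With three models predicting [["web", "p2p"], ["web", "web"], ["p2p", "p2p"]] and weights [0.9, 0.6, 0.8], the first flow is labeled "web" (1.5 vs 0.8) and the second "p2p" (1.7 vs 0.6), so a single weak model is outvoted by more reliable ones.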