Since the seminal paper by Knopp and Humblet, which showed that the throughput of a single-cell system is maximized if only one terminal transmits at a time, there has been large interest in opportunistic communications and its relation to various fairness measures. In multicell systems, on the other hand, transmission power must be allocated such that some overall utility function is maximized, typically under fairness constraints. Furthermore, in multicell systems the degrees of freedom in resource allocation include serving cell selection, which allows for load balancing and thereby the efficient use of radio resources. In this paper we formulate the joint serving cell selection (link selection) and power allocation problem as an optimization task whose purpose is to maximize either the minimum user throughput or the multicell sum throughput. The max-min problem and a simplified maximum throughput problem are both NP-hard, and we therefore propose heuristic solution approaches. We present numerical results that give new and valuable insights into the trade-off between fair and sum-throughput-optimal joint resource allocation strategies.
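As a hedged illustration (the symbols below are our own, not necessarily the paper's notation), the max-min variant of such a joint problem can be stated as

\[
\max_{x,\,p}\ \min_{u}\ r_u(x,p)
\quad \text{s.t.} \quad
\sum_{k} x_{u,k} = 1\ \ \forall u, \qquad
x_{u,k} \in \{0,1\}, \qquad
0 \le p_k \le p_k^{\max}\ \ \forall k,
\]

where x_{u,k} indicates that user u is served by cell k, p_k is the transmit power of cell k, and r_u(x,p) is the resulting throughput of user u under inter-cell interference. The binary assignment variables, coupled with interference-dependent rates, are what make such problems hard to solve exactly.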
In radio network performance evaluations it is of interest to use simulation environments that represent realistic networks. In this paper, two widely used simulation environments are benchmarked against live network path gain measurements. Some parameter adjustments are made to fit the simulation environments to the measured path gain and cell isolation statistics. The need to modify base station antenna model parameters to match the intra-site cell isolation of real networks is identified and elaborated. In particular, an increase of the maximum attenuation to 35 dB, the introduction of mechanical tilt, and a reduction of the vertical beam width are needed. In addition, some adjustments are made to better represent the assumed traffic distribution, with 80% indoor users, and the expected propagation losses, resulting in a path gain level on par with the measured network.
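For reference, base station antenna models in such simulators commonly follow the 3GPP parameterization, in which the vertical pattern is

\[
A_V(\theta) = -\min\!\left[\, 12\left(\frac{\theta - \theta_{\mathrm{tilt}}}{\theta_{\mathrm{3dB}}}\right)^{2},\ A_m \right],
\]

so the adjustments above correspond to setting the maximum attenuation A_m to 35 dB, shifting θ_tilt by the added mechanical tilt, and reducing the half-power beam width θ_3dB. This is a sketch of the standard model, not necessarily the exact pattern implemented in the two benchmarked environments.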
In this paper we introduce a novel framework for traffic identification that employs machine learning techniques and focuses on the estimation of multiple traffic-influencing factors. The effect of these factors is handled by training several machine learning models. We combine the outcomes of the multiple models via a recombination algorithm to achieve high overall true positive and true negative classification ratios and low overall false positive and false negative classification ratios. The proposed method can improve the performance of any machine learning based traffic identification engine, making it capable of efficient operation in changing network environments, i.e., when the probing node is trained and tested at different sites.
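The recombination step is not specified here; as a minimal sketch under the assumption of a weighted-vote recombination (all names below are illustrative, not the paper's algorithm):

from collections import Counter

def recombine(predictions, weights):
    # Weighted vote over the labels proposed by several per-factor models.
    # predictions: one label per model, e.g. ['p2p', 'web', 'p2p']
    # weights: per-model confidence weights of the same length
    scores = Counter()
    for label, weight in zip(predictions, weights):
        scores[label] += weight
    return scores.most_common(1)[0][0]

# Three models trained on different traffic-influencing factors
print(recombine(['p2p', 'web', 'p2p'], [0.6, 0.9, 0.7]))  # -> 'p2p'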
Device-to-device (D2D) communications underlaying a cellular infrastructure has been proposed as a means of taking advantage of the physical proximity of communicating devices, increasing resource utilization, and improving cellular coverage. Relative to the traditional cellular methods, there is a need to design new peer discovery methods, physical layer procedures, and radio resource management algorithms that help realize the potential advantages of D2D communications. In this article we use the 3GPP Long Term Evolution system as a baseline for D2D design, review some of the key design challenges, and propose solution approaches that allow cellular devices and D2D pairs to share spectrum resources and thereby increase the spectrum and energy efficiency of traditional cellular networks. Simulation results illustrate the viability of the proposed design.
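One radio resource management decision in this setting is mode selection between the direct D2D link and the conventional cellular path. A minimal sketch, assuming a simple path-gain comparison rule (the threshold logic and names are illustrative, not the article's algorithm):

def select_mode(g_d2d_db, g_ue1_enb_db, g_ue2_enb_db, d2d_bias_db=0.0):
    # Compare the direct link against the weaker hop of the two-hop
    # cellular path; a positive bias favors the spectrum-efficient D2D mode.
    cellular_db = min(g_ue1_enb_db, g_ue2_enb_db)
    return 'd2d' if g_d2d_db + d2d_bias_db >= cellular_db else 'cellular'

print(select_mode(-95.0, -110.0, -105.0))  # nearby devices -> 'd2d'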
The concept of antenna muting for reducing energy consumption in LTE is presented, and system level evaluation results are provided. The results indicate that antenna muting can reduce energy consumption by up to around 50% in a low load scenario without significantly affecting the user throughput. Results for 4TX, 2TX, and 1TX cell configurations are presented. The system level simulator used includes detailed models of UE pre-coder selection and feedback, and we show in this paper that these algorithms perform well even when antennas are muted in a way that was not considered during the design of the algorithms. Antenna muting is a promising technique that operates on a rather short time scale in order to reduce the energy consumption of an LTE cell.
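A minimal sketch of how a short-time-scale muting decision could be driven by cell load (the thresholds below are illustrative assumptions, not the evaluated algorithm):

def antennas_to_use(load, configured_tx=4, low=0.2, high=0.4):
    # load: fraction of utilized resources in the cell, 0..1.
    # The paper evaluates 4TX, 2TX, and 1TX configurations; this rule
    # simply maps low utilization to fewer active TX antennas.
    if load < low:
        return 1
    if load < high:
        return 2
    return configured_tx

print(antennas_to_use(0.1))  # low load -> mute down to 1 TX antenna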
In recent years, there has been a strong focus on network management simplicity under the concept of self-organizing networks (SON). A number of SON use cases and features have been, and continue to be, discussed for 3G Long Term Evolution (LTE). The main SON feature in the first LTE release is a method for automatic configuration of neighbor cell relations: Automatic Neighbor Relations (ANR). In this paper, we describe neighbor relations management and ANR, and evaluate ANR in a pre-launch, commercially deployed network cluster. The results indicate that ANR configures discovered and needed neighbor relations such that handover can be performed in combination with the neighbor relation establishment without dropping the connection.
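A minimal sketch of the standardized ANR flow, in which a UE reporting a previously unknown physical cell identity (PCI) is asked to read that cell's global identity (ECGI) before the relation is created (function and table names are illustrative):

def anr_on_report(pci, neighbor_table, read_global_cell_id):
    # If the reported PCI is unknown, resolve it to an ECGI via the UE
    # and store the new neighbor relation; handover can then proceed.
    if pci not in neighbor_table:
        neighbor_table[pci] = read_global_cell_id(pci)
    return neighbor_table[pci]

# Stubbed UE report resolving PCI 301 to an illustrative global identity
table = {}
print(anr_on_report(301, table, lambda pci: 'ecgi-%04d' % pci))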
Heterogeneous networks (HetNets), with a topology of mixed macro cells and low-power nodes (LPNs), form an important step of capacity enhancement for LTE and LTE-A. In this paper, we present an optimization framework for load balancing in LTE HetNets by means of cell range assignment using cell-specific offsets. For any given offset setting, the resulting cell load is obtained by solving a system of non-linear equations characterizing the load-coupling relation between cells. We present a computationally efficient bounding scheme to approximate the solution of the non-linear system and provide theoretical insights into the monotonicity and convergence of the scheme. The bounding scheme is embedded into an algorithm based on the principle of design of experiments (DOE) for cell offset optimization. Simulation results demonstrate the effectiveness of the optimization process for LTE load balancing with HetNet elements.
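A minimal sketch of the fixed-point view of load coupling, assuming a monotone load function f (iterating from zero load yields an increasing sequence of lower bounds; this stands in for, but is not, the paper's specific bounding scheme):

import numpy as np

def iterate_load(f, n_cells, iters=50):
    # Iterate rho <- f(rho) from zero load; with monotone f the sequence
    # increases toward the solution of the coupling system rho = f(rho).
    rho = np.zeros(n_cells)
    for _ in range(iters):
        rho = np.minimum(f(rho), 1.0)  # cap at full utilization
    return rho

# Two symmetric cells whose loads grow with each other's interference
f = lambda rho: np.array([0.3, 0.5]) + 0.4 * rho[::-1]
print(iterate_load(f, n_cells=2))  # -> approx. [0.595, 0.738]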
Relaying is a feature defined in LTE Release 10 to provide coverage in new areas and/or to improve cell-edge throughput. To investigate relay performance in a real network, an LTE TDD in-band relay prototype was developed. Based on this prototype, field measurements were conducted using LTE Release-8 terminals. Both indoor and outdoor scenarios were tested. Measurement results show that relays, once properly deployed, provide good coverage in the coverage holes of a donor eNB. Besides coverage extension, relays can also improve the data rate in poorly covered areas of a donor eNB, i.e., at the cell edge. The throughput of a terminal served by this relay prototype reaches around 8 Mbps in the uplink and 20 Mbps in the downlink. Regarding latency, given that uplink data is always scheduled, the measured round-trip time via the relay is around 10 ms longer than that via the donor eNB directly.