Abstract—A novel minimum-bit-error-rate (MBER) space–time-equalization (STE)-based multiuser detector (MUD) is proposed for multiple-receive-antenna-assisted space-division multiple-access systems. It is shown that the MBER-STE-aided MUD significantly outperforms the standard minimum mean-square error design in terms of the achievable bit-error rate (BER). Adaptive implementations of the MBER STE are considered, and both block-data-based and sample-by-sample adaptive MBER algorithms are proposed. The latter, referred to as the least-BER (LBER) algorithm, is compared with the most popular adaptive algorithm, known as the least mean square (LMS) algorithm. It is shown that in the case of binary phase-shift keying, the computational complexity of the LBER STE is about half of that required by the classic LMS STE. Simulation results demonstrate that the LBER algorithm performs consistently better than the classic LMS algorithm, both in terms of its convergence speed and its steady-state BER performance.
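The LMS and LBER sample-by-sample updates contrasted in this abstract can be sketched as follows for real-valued BPSK. This is a minimal illustration: the step size mu, kernel width rho, and the two-antenna setting in the usage example are assumptions for the sketch, not values from the paper.

```python
import numpy as np

def lms_update(w, x, b, mu=0.01):
    """Classic LMS: a steepest-descent step on the squared error (b - w^T x)^2."""
    e = b - w @ x
    return w + mu * e * x

def lber_update(w, x, b, mu=0.01, rho=0.3):
    """Least-BER (LBER) stochastic-gradient step for BPSK: the kernel
    term exp(-y^2 / (2 rho^2)) weights the update by how close the
    output y = w^T x is to the decision boundary (rho: kernel width)."""
    y = w @ x
    return w + mu * np.exp(-y * y / (2 * rho * rho)) * b * x
```

Note that the LBER step uses only the training bit's sign and a scalar kernel weight rather than a full error term, one plausible reading of the reduced BPSK complexity reported above.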
However, as recognised in [3–5], a better strategy is to choose the linear detector’s coefficients so as to directly minimise the error probability or bit-error ratio (BER), rather than the mean-squared error (MSE). This is because minimising the MSE does not necessarily guarantee that the BER of the system is also minimised. The family of detectors that directly minimises the BER is referred to as the minimum-bit-error-rate (MBER) detector class [3–6]. In [7] we have derived the exact MBER MUD weight calculation for the uplink SDMA-OFDM system. We have also shown that the MBER MUD may significantly outperform the MMSE MUD in terms of the achievable BER in a two-user OFDM scenario.
This paper considers interference-limited communication systems where the desired user and the interfering users are symbol-synchronized. A novel adaptive beamforming technique is proposed for a quadrature phase shift keying (QPSK) receiver, based directly on minimizing the bit-error rate. It is demonstrated that the proposed minimum-bit-error-rate (MBER) approach utilizes the system resource (antenna array elements) more intelligently than the standard minimum mean square error (MMSE) approach. Consequently, an MBER-beamforming-assisted receiver is capable of providing significant performance gains, in terms of a reduced bit-error rate, over an MMSE beamforming one. A block-data-based adaptive implementation of the theoretical MBER beamforming solution is developed based on the classical Parzen window estimate of the probability density function. Furthermore, a sample-by-sample adaptive implementation is also considered, and a stochastic gradient algorithm, called the least bit-error rate (LBER) algorithm, is derived for the beamforming-assisted QPSK receiver.
Space Division Multiple Access (SDMA) is capable of substantially increasing the achievable system capacity with the aid of antenna arrays [1, 2]. In recent years, Orthogonal Frequency Division Multiplexing (OFDM) has attracted intensive research interest owing to its numerous benefits, such as that of converting frequency-selective channels into parallel non-dispersive flat-fading channels [2]. Combining these two techniques holds the promise of achieving reliable wireless communications at high data rates with the aid of efficient MUD algorithms. The family of Minimum Bit-Error Rate (MBER) MUDs has the potential of outperforming the classic Minimum Mean Square Error (MMSE) receivers, since it directly minimizes the Bit-Error Ratio (BER), rather than the MSE.
[8] S. Chen, L. Hanzo, and N. N. Ahmad, “Adaptive Minimum Bit-Error Rate Beamforming Assisted Receiver for Wireless Communications,” in Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Hong Kong, 2003, vol. IV, pp. 640–643.
In broadband wireless communication, orthogonal frequency division multiplexing (OFDM) is used as a multi-carrier technique to combat inter-symbol interference (ISI). An adaptive array antenna (AAA) can be combined with OFDM to reduce the effect of directional interference. The optimum beamformer weight set is based on the minimum bit-error rate (MBER) criterion in pilot-assisted OFDM systems. A block-data adaptive implementation of the MBER beamforming solution is developed based on Parzen window estimates of the probability density function. The gradient Newton algorithm has been proposed to enhance performance and increase the convergence rate, but at the expense of complexity. In this paper, a block-processing objective function for the MBER criterion is formulated, and three beamforming algorithms, the least MBER (LMBER), Newton least MBER (NLMBER), and block-Shanno MBER (BSMBER), are proposed for a pre-FFT OFDM system. Simulation results show that the BSMBER algorithm has the lowest computational complexity, the best BER performance, and the fastest convergence rate of the three.
• An adaptive beamforming-assisted multiuser detection scheme based on the minimum bit-error rate design has been derived for multiple-receive-antenna-aided SDMA systems
A novel adaptive beamforming technique is proposed for wireless communication with quadrature phase shift keying signalling, based on the minimum bit-error rate (MBER) criterion. It is shown that the MBER approach provides a significant performance gain, in terms of a smaller bit-error rate, over the standard minimum mean square error approach. Using the classical Parzen window estimate of the probability density function, both block-data and sample-by-sample adaptive implementations of the MBER solution are developed.
Abstract: Adaptive filtering has traditionally been developed based on the minimum mean square error (MMSE) principle and has found ever-increasing applications in communications. The paper develops adaptive filtering based on an alternative minimum bit-error rate (MBER) criterion for communication applications. It is shown that MBER filtering exploits the non-Gaussian distribution of the filter output effectively and, consequently, can provide a significant performance gain in terms of a smaller bit-error rate (BER) over the MMSE approach. Adopting the classical Parzen window, or kernel density, estimate of a probability density function (pdf), a block-data gradient adaptive MBER algorithm is derived. A stochastic gradient adaptive MBER algorithm is further developed for sample-by-sample adaptive implementation of MBER filtering. Extension of the MBER approach to adaptive nonlinear filtering is also discussed.
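The Parzen-window BER estimate that underlies the block-data gradient algorithm can be sketched as follows for binary signalling. The Gaussian kernel width rho and the toy channel in the usage example are illustrative assumptions; a block-data algorithm would then step the weights along the negative gradient of this smooth estimate.

```python
import numpy as np
from math import erfc, sqrt

def q_func(u):
    """Gaussian tail function Q(u)."""
    return 0.5 * erfc(u / sqrt(2.0))

def parzen_ber(w, X, b, rho=0.3):
    """Kernel-density (Parzen window) estimate of the BER of the linear
    filter y = w^T x over a block of K training pairs (x_k, b_k):
    BER_hat = (1/K) sum_k Q( b_k y_k / (rho ||w||) )."""
    y = X @ w
    scale = rho * np.linalg.norm(w)
    return float(np.mean([q_func(bk * yk / scale) for bk, yk in zip(b, y)]))
```

Because the estimate is smooth in w (unlike the empirical error count), it is differentiable, which is what makes gradient adaptation of the MBER solution possible.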
However, as recognized in [3]–[5] in a code-division multiple-access (CDMA) context, a better strategy is to choose the linear detector’s coefficients so as to directly minimize the error probability or bit-error ratio (BER), rather than the mean-squared error (MSE). This is because minimizing the MSE does not necessarily guarantee that the BER of the system is also minimized. The family of detectors that directly minimizes the BER is referred to as the minimum-bit-error-rate (MBER) detector class [3]–[6]. The MBER criterion has been successfully applied in the linear equalization of binary signaling [4], for decision feedback equalization (DFE) [7], and in linear MIMO receivers [6]. It has also been shown that the MBER detector can be used effectively for linear multiuser detection in CDMA systems [3], [5].
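The claim that minimising the MSE need not minimise the BER can be checked numerically for a two-user, two-antenna BPSK toy model. The channel vectors and noise level below are illustrative assumptions; since the exact BER of a linear detector depends only on the direction of the weight vector, a fine scan over directions upper-bounds the attainable MBER performance.

```python
import numpy as np
from math import erfc, sqrt

def q_func(u):
    return 0.5 * erfc(u / sqrt(2.0))

def exact_ber(w, h1, h2, sigma):
    """Exact BER of the detector sign(w^T r) for user 1 under BPSK,
    with r = b1 h1 + b2 h2 + n, averaged over the interferer's symbol b2."""
    s = sigma * np.linalg.norm(w)
    return 0.5 * sum(q_func((w @ (h1 + b2 * h2)) / s) for b2 in (-1.0, 1.0))

h1 = np.array([1.0, 0.2])   # desired user's channel (illustrative)
h2 = np.array([0.8, 1.0])   # strong interferer's channel (illustrative)
sigma = 0.3                 # noise standard deviation

# MMSE weights: w = (h1 h1^T + h2 h2^T + sigma^2 I)^{-1} h1
R = np.outer(h1, h1) + np.outer(h2, h2) + sigma ** 2 * np.eye(2)
w_mmse = np.linalg.solve(R, h1)

# Approximate the MBER solution by scanning all weight directions
angles = np.linspace(0.0, 2.0 * np.pi, 7200, endpoint=False)
ber_mber = min(exact_ber(np.array([np.cos(a), np.sin(a)]), h1, h2, sigma)
               for a in angles)
```

By construction ber_mber can be no worse than the BER of the MMSE weights; in near-far scenarios the gap can become substantial, which is the motivation for the MBER detector class.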
where the extrinsic information is gleaned from the surrounding RSC encoded bits, excluding the specific bit considered [5]. We note that as usual in joint iterative detection and decoding schemes [5], we exchange the extrinsic information concerning both the original information bits and parity bits, rather than only that of the information bits, although only the LLRs of the latter are needed in the classic turbo decoder of Berrou et al. [3]. After interleaving, the extrinsic information delivered by the channel decoders is then fed back to the SISO multiuser detector, as the a priori information concerning the RSC encoded bits of all the users for exploitation during the next iteration.
THE ever-increasing demand for mobile communication capacity has motivated the employment of space-division multiple access for the sake of improving the achievable spectral efficiency. A particular approach that has shown real promise in achieving substantial capacity enhancements is the use of adaptive antenna arrays [1]–[10]. Adaptive beamforming is capable of separating signals transmitted on the same carrier frequency, provided that they are separated in the spatial domain. A beamformer appropriately combines the signals received by the different elements of an antenna array to form a single output. Classically, this has been achieved by minimizing the mean square error (MSE) between the desired output and the actual array output. This principle has its roots in the traditional beamforming employed in sonar and radar systems. Adaptive implementation of the minimum MSE (MMSE) beamforming solution can be realized using temporal reference techniques [2]–[4], [11]–[14]. Specifically, block-based beamformer weight adaptation can be achieved, for example, using the sample matrix inversion (SMI) algorithm [11], [12], while sample-by-sample adaptation can be carried out using the least mean square (LMS) algorithm [13], [14].
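The two classical MMSE adaptation routes mentioned above, block SMI and sample-by-sample LMS, can be sketched as follows. The sketch is restricted to real-valued snapshots for clarity (complex antenna data needs conjugates in R, p, and the update), and the diagonal loading delta, step size mu, and the test scenario are illustrative assumptions.

```python
import numpy as np

def smi_weights(X, d, delta=1e-3):
    """Block MMSE beamformer via sample matrix inversion (SMI):
    w = R^{-1} p, with the covariance R = X^T X / K and the
    cross-correlation p = X^T d / K estimated from the K snapshot
    rows of X; delta adds diagonal loading for numerical stability."""
    K = X.shape[0]
    R = X.T @ X / K + delta * np.eye(X.shape[1])
    p = X.T @ d / K
    return np.linalg.solve(R, p)

def lms_step(w, x, d, mu=0.01):
    """One sample-by-sample LMS update toward the same MMSE solution:
    e = d - w^T x, then w <- w + mu * e * x."""
    e = d - w @ x
    return w + mu * e * x
```

SMI converges in one block at the cost of a matrix inversion, while LMS trades convergence speed for a per-sample cost that is only linear in the number of array elements.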
As far as channel-state information is concerned, we assumed no channel-state information to be available at the receiver side; thus, channel equalization was preceded by the adaptive channel identification algorithm given in (35). In all simulations, the delay parameter D used in the decision rule (3) was set by exhaustive search. The step size of the gradient-descent-type algorithms was set empirically. The experiments show that the attained BER is not very sensitive to the value of the step size, while the convergence speed is highly dependent on it, as described below. Furthermore, the value of the step size depends on the SNR as well, since the error surface tends to become more complicated as the SNR increases [6].
The design of suitable DFT-modulated filter banks can trade implementation complexity in a cognitive radio transceiver against high-resolution spectrum sensing and minimum-bit-error-rate performance in spectrum access. The system specifications translate into a constrained optimization procedure for finding the corresponding prototype filter coefficients. When the design is applied to a system with characteristics similar to IEEE 802.11g, the system specifications can be met by properly treating the interference phenomena arising both from third parties and from self-interference in the filter bank. The transceiver complexity benefits from the polyphase implementation of DFT-modulated filter banks.
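The polyphase implementation referred to above can be illustrated on the analysis side of a DFT-modulated filter bank. The channel count and prototype length below are illustrative assumptions; the point of the sketch is the identity that makes the structure cheap, namely that filtering through the polyphase branches followed by one DFT reproduces the direct modulated-filter outputs at the decimated instants.

```python
import numpy as np

def analysis_direct(x, h, M, m):
    """Channel outputs y_k of an M-channel DFT-modulated analysis bank
    at decimated time mM, computed directly with the modulated filters
    h_k[l] = h[l] * exp(j 2 pi k l / M)."""
    n0 = m * M
    return np.array([sum(h[l] * np.exp(2j * np.pi * k * l / M) * x[n0 - l]
                         for l in range(len(h)) if 0 <= n0 - l < len(x))
                     for k in range(M)])

def analysis_polyphase(x, h, M, m):
    """Same outputs via the polyphase decomposition: accumulate the
    branch sums u[r] = sum_p h[r + pM] x[mM - r - pM], then apply a
    single M-point (inverse) DFT across the branches."""
    n0 = m * M
    u = np.zeros(M, dtype=complex)
    for r in range(M):
        for p in range(len(h) // M):
            idx = n0 - r - p * M
            if 0 <= idx < len(x):
                u[r] += h[r + p * M] * x[idx]
    return M * np.fft.ifft(u)
```

The direct form costs on the order of M * len(h) multiplications per output block, whereas the polyphase form costs len(h) multiplications plus one FFT, which is the complexity benefit exploited in the transceiver.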
Neural-network-based smart antennas are capable of improving the achievable wireless system capacity and quality by suppressing the effects of both inter-symbol interference (ISI) and co-channel interference (CCI). This paper considers a space-division multiple access (SDMA) uplink scheme, where each transmitter employs a single antenna, while the base station (BS) receiver has multiple antennas (Chen et al., 2004). In a CDMA system, each user is separated by a unique user-specific spreading code. By contrast, an SDMA system differentiates each user by the associated unique user-specific channel impulse response (CIR) encountered at the receiver antennas. In this analogy, the unique user-specific CIR plays the role of a user-specific CDMA signature. However, owing to the non-orthogonal nature of the CIRs, effective multiuser detection (MUD) is required for separating the users in an SDMA system (Winters et al., 1994; Tsoulos et al., 1997; Blogh et al., 2002; Hanzo et al., 2003). Neural networks have recently been used in the design of multiuser receivers for SDMA systems. Neural-network-based receivers employing the Widrow-Hoff criterion usually show good performance and have a simple adaptive implementation, albeit at the expense of a higher computational complexity (John, 1998; Widrow and Lehr, 1990; Widrow and Winter, 1988; Zuradha, 2006). The deployment of non-linear structures, such as neural networks, can more effectively mitigate inter-symbol interference, caused by the multipath propagation of radio signals, and multiple access interference, which arises owing to the non-orthogonality of the users’ signals. In recent years, various artificial neural network structures have been used in the design of multiuser detectors (MUDs). These neural systems use non-linear functions to create the decision boundaries for detecting the transmitted symbols, whereas conventional MUDs employ linear functions to form such decision regions.
In addition, the bulk of previously reported neural and linear receivers are based on the Widrow-Hoff criterion, since this approach usually shows good performance and has a simple adaptive implementation. However, the approximate minimum bit-error rate (AMBER) methodology has not previously been proposed for SDMA networks, and hence the present work throws light on this area. In this paper, a similar approximate minimum-bit-error-rate approach to adaptive multiuser receivers, using dynamic neural networks based on an adaptive learning algorithm, is proposed.
Bluetooth provides a bit rate of 1 Mbps. An FHSS scheme is used at the physical level; each master chooses a different hopping sequence so that piconets can operate in the same area without interfering with each other. Hopping frequencies range over 79 frequency channels in the ISM band, each channel being 1 MHz wide. The nominal hop dwell time is 625 µs. Sequences are created by generating several sub-sequences, each composed of 32 hops. The first sub-sequence is obtained by taking 32 hops at random over the first 64 MHz of the frequency spectrum; then the successive 32 MHz are skipped, and the next sub-sequence is randomly chosen among the following 64 MHz. The procedure is repeated until the hopping sequence is completed [4]. A TDD technique is used to transmit and receive data in a piconet: each packet is transmitted in a slot corresponding to the minimum dwell time; slots are centrally allocated by the master and alternately used for master and slave transmissions.
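The sub-sequence construction described above can be sketched as follows. This follows only the simplified description in the text, not the exact hop-selection kernel of the Bluetooth specification, and the function name and seeding are illustrative.

```python
import random

CHANNELS = 79   # 1 MHz-wide hop channels in the 2.4 GHz ISM band
SUB_LEN = 32    # hops per sub-sequence
WINDOW = 64     # each sub-sequence is drawn from a 64 MHz window
SKIP = 32       # the following 32 MHz are skipped between windows

def hopping_sequence(n_sub, seed=None):
    """Build a hop sequence from n_sub sub-sequences of 32 hops each:
    every sub-sequence takes its hops at random from a 64-channel
    window, then the window start advances by 64 + 32 channels
    (wrapping modulo 79), so the intervening 32 MHz are skipped."""
    rng = random.Random(seed)
    seq, start = [], 0
    for _ in range(n_sub):
        window = [(start + i) % CHANNELS for i in range(WINDOW)]
        seq.extend(rng.choice(window) for _ in range(SUB_LEN))
        start = (start + WINDOW + SKIP) % CHANNELS
    return seq
```

With the 625 µs dwell time quoted above, one 32-hop sub-sequence spans 20 ms of air time.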
The bit-error rate (BER) performance of the distributed-antenna cellular DS-CDMA system is investigated when minimum mean-square error (MMSE) multiuser detection is employed.
Multiple transmit and receive antennas can be used to form multiple-input multiple-output (MIMO) channels, increasing the capacity by a factor of the minimum of the numbers of transmit and receive antennas. In this paper, orthogonal frequency division multiplexing (OFDM) for MIMO channels (MIMO-OFDM) is considered for wideband transmission, to mitigate inter-symbol interference and enhance system capacity. MIMO is a diversity technique and one of the most recent contributions to the communication field. MIMO improves the received signal quality and increases the data rate by using digital processing techniques to combine the signals arriving via multiple wireless paths, i.e., through the multiple transmit and receive antennas. This type of configuration is, however, quite complex and consumes considerable power, so several schemes have been proposed to reduce the system complexity. In wireless communication, frequency-selective fading in the channel gives rise to inter-symbol interference, which can degrade the communication quality; OFDM is used to overcome this problem.