OFDM frequency sensitivity (MATLAB related)

SUMMARY

This discussion focuses on simulating an OFDM (Orthogonal Frequency Division Multiplexing) system in MATLAB to analyze the Bit Error Rate (BER) as a function of frequency offset. The user references a specific MATLAB script for generating the offset vs. BER curve and reports that their own implementation deviates from it. The provided code covers the OFDM parameters, transmitter, and receiver, but the resulting BER curve does not exhibit the expected steep increase near zero frequency offset, so adjustments to the code are needed.

PREREQUISITES
  • Understanding of OFDM principles and parameters
  • Familiarity with MATLAB programming and simulation
  • Knowledge of Bit Error Rate (BER) calculations
  • Experience with signal processing concepts, particularly in relation to frequency offset
NEXT STEPS
  • Review MATLAB's 'awgn' function for accurate noise modeling in simulations (see the short sketch after this list)
  • Investigate the impact of cyclic prefix length on BER performance
  • Learn about frequency offset compensation techniques in OFDM systems
  • Explore MATLAB's pskmod/pskdemod functions for modulation and demodulation (the older 'modem' objects are removed in current releases)
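As a small illustration of the 'awgn' point (my own example, not from the thread): awgn(x,SNR) assumes the input power is 0 dBW, while awgn(x,SNR,'measured') first measures the actual input power, which matters for IFFT outputs whose average power is far below 0 dBW.

x = ifft(2*randi([0 1],1,2048)-1); %OFDM symbol; average power is about 1/2048, far below 0 dBW
y1 = awgn(x,10); %assumes a 0 dBW input, so the symbol ends up buried in noise
y2 = awgn(x,10,'measured'); %a true 10 dB SNR relative to the measured signal power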
USEFUL FOR

This discussion is beneficial for signal processing engineers, MATLAB developers, and researchers focused on wireless communications, particularly those working on OFDM system simulations and performance optimization.

O.J.:
Hello everyone,

I am trying to simulate an OFDM system, and part of what I want to do is investigate the Bit Error Rate with respect to frequency offset. I understand what the offset vs. BER curve should look like from the code at this link http://www.dsplog.com/db-install/wp-content/uploads/2009/08/script_frequency_offset_ofdm.m which is available on this page:
http://www.dsplog.com/2009/08/08/effect-of-ici-in-ofdm/
All you have to do is run the code in MATLAB and you will see the correct offset vs. error plot.

The offset vs. BER plot from my own code looks incorrect. By that I mean the error starts growing slowly from zero offset and keeps growing slowly, like a parabola, when theoretically the slope of the curve should be steepest just above or below zero offset and should then level off approaching an offset of +/- 0.5.
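For reference, the standard single-tap model for a carrier frequency offset eps normalized to the subcarrier spacing (Moose's classic result) is Y(k) = S0*X(k) + ICI + noise, where S0 = sin(pi*eps)/(N*sin(pi*eps/N)) * exp(j*pi*eps*(N-1)/N). The short sketch below (my own illustration, not the dsplog code) plots the attenuation and the common phase rotation that shape the expected curve:

%Sketch: desired-carrier attenuation and common phase error (CPE) from the
%standard CFO model above, for offsets between -0.5 and +0.5
N = 2048;
epsn = linspace(-0.5,0.5,201); %normalized frequency offset
amp = sin(pi*epsn)./(N*sin(pi*epsn/N)); %attenuation of the wanted carrier
amp(isnan(amp)) = 1; %limit value at exactly zero offset
cpe = pi*epsn*(N-1)/N; %common phase rotation in radians
subplot(2,1,1); plot(epsn,abs(amp)); ylabel('|S_0|');
subplot(2,1,2); plot(epsn,cpe); ylabel('CPE (rad)'); xlabel('Normalized Offset');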

I cannot understand what I should modify in my code to address this issue. Here is the code I am using:
%% OFDM parameters
N = 2048; %number of carriers
Tu = 0.001; %symbol period for each carrier
T = Tu/N; %Symbol period of serial stream
R = 1/8; %guard time ratio (WiMAX standard) R*N should be integer
G = Tu*R; %guard time
fu = 1/Tu; %symbol rate of each datastream
Fs = fu*N; %sampling frequency
Q = 4; %upsampling factor
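%sanity check on the derived values (added for clarity): with Tu = 1 ms
%the subcarrier spacing is fu = 1 kHz, Fs = N/Tu = 2.048 MHz, the guard
%time is G = 125 us, and the guard length is R*N = 256 samples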



%% TRANSMITTER

%generate message
Mt = randi([0 1],1,N); %random bit stream (randint is removed from current MATLAB; randi replaces it)
tn = 0:N-1;
stem(tn/Fs,Mt); xlabel('Time (seconds)'); ylabel('Binary Data');

%BPSK (mapping 0's to -1's)
M1=2*Mt-1;
stem(tn/Fs,M1); %plot the original message (BPSK)
xlabel('Time (seconds)'); ylabel('BPSK Data');

%S/P conversion
A = reshape(M1,N,1);

%modulate each subcarrier using IFFT
B = ifft(A);

%P/S conversion
C = reshape(B,1,N);

%guard interval: append a copy of the first R*N samples after the symbol
%(note this puts the cyclic copy at the END of the symbol, a cyclic
%suffix, so the useful data remains in the first N samples)
Guard = zeros(1,R*N);
C_gcp = [C Guard];
C_gcp(N+1:N+R*N) = C_gcp(1:R*N);

%oversample by a factor of Q with a zero-order hold (crude DAC model)
D = ones(Q,1)*C_gcp;
D = D(:).'; %flatten to a row vector without conjugating

%plot the OFDM signal (real part; D is complex after the IFFT)
t = 0:Q*(N+R*N)-1;
plot(t/(Q*Fs),real(D)); xlabel('Time (seconds)'); ylabel('OFDM Signal');
title('OFDM Symbol (with guard time & cyclic prefix) vs Time');

%% SIMULATION SETUP
SNR = 10; %SNR in dB for the awgn call below
BER = 0;
BER_t = 0;
N_Max = 50; %noise realizations averaged per offset value
S = []; %collects the average BER for each offset
f0 = 1/Tu; %subcarrier spacing
offset = -0.5:0.1:0.5; %frequency offset normalized to the subcarrier spacing

for tr = 1:length(offset)
for n = 1:N_Max

%% RECEIVER

%add white Gaussian noise at the measured signal power
D_n = awgn(D,SNR,'measured');

%apply the frequency offset; offset(tr) is in units of the subcarrier spacing f0
D_n_o = D_n.*exp(1i*2*pi*offset(tr)*f0*t/(Q*Fs));

%Downsampling
E = [];
for j =1:N+R*N
E(j) = D_n_o(j*Q);
end
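%equivalently, vectorized: E = D_n_o(Q:Q:end); either way this keeps the
%last sample of each group of Q held samples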

%remove the guard interval; since the cyclic copy was appended at the end,
%the useful data is simply the first N samples
F = E(1:N);

%S/P and FFT demodulation across the N subcarriers
M2 = fft(F);

%demapping; the modem.pskdemod object is removed from current MATLAB, so
%use the function form (initial phase pi matches the transmitter's
%0 -> -1, 1 -> +1 mapping)
Mr = pskdemod(M2,2,pi);

%Bit Error Computation
Error = length(find(Mt~=Mr));
Error_Rate = Error/length(Mt);
BER_t = BER_t + Error_Rate;
end
BER = BER_t/N_Max;
S = [S BER];
BER_t = 0;
end

plot(offset,S); xlabel('Normalized Offset'); ylabel('Bit Error Rate');
grid on
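One thing worth checking (a guess on my part, not a confirmed diagnosis): with BPSK, the offset also produces a common phase error (CPE) exp(j*pi*eps*(N-1)/N) on every subcarrier, so the real part of each decision variable shrinks roughly like cos(pi*eps), a quadratic loss near zero offset, which would look exactly like a parabola on a linear axis. The dsplog script also plots BER on a log axis, which makes the rise near zero look much steeper. The minimal standalone sketch below (my own simplification, no guard interval or oversampling) compares raw decisions against CPE-compensated ones:

%Standalone sketch (an assumed explanation, not a verified fix): BPSK-OFDM
%with CFO and AWGN, demodulated with and without CPE compensation
N = 2048; SNR = 10; N_Max = 50;
offset = -0.5:0.05:0.5;
ber_raw = zeros(size(offset)); ber_cpe = zeros(size(offset));
for tr = 1:length(offset)
epsn = offset(tr);
for n = 1:N_Max
bits = randi([0 1],1,N);
x = ifft(2*bits-1); %BPSK-OFDM symbol
r = awgn(x.*exp(1i*2*pi*epsn*(0:N-1)/N),SNR,'measured'); %CFO plus noise
Y = fft(r);
cpe = exp(1i*pi*epsn*(N-1)/N); %common phase error term
ber_raw(tr) = ber_raw(tr) + mean((real(Y)>0)~=bits);
ber_cpe(tr) = ber_cpe(tr) + mean((real(Y./cpe)>0)~=bits);
end
end
semilogy(offset,ber_raw/N_Max,'-o',offset,ber_cpe/N_Max,'-x');
xlabel('Normalized Offset'); ylabel('Bit Error Rate'); grid on
legend('raw FFT output','after CPE compensation');

If the compensated curve recovers the expected shape, the corresponding change in the posted code would be to multiply M2 by exp(-1i*pi*offset(tr)*(N-1)/N) before the pskdemod call, and to plot the result with semilogy as the dsplog script does.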
 
bump...
 
