Channel Allocation Problem in Computer Network

Channel allocation is the process of dividing a single channel among multiple users so that each user can carry out their own tasks. The number of users may vary each time the allocation takes place. If there are N users and the channel is divided into N equal-sized subchannels, each user is assigned one portion. If the number of users is small and does not vary over time, Frequency Division Multiplexing (FDM) can be used, as it is a simple and efficient way to allocate the channel bandwidth.
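As a quick illustration of the static approach, here is a minimal sketch that splits a frequency band into N equal sub-bands, one per user. The band edges and user count below are hypothetical, chosen only for illustration:

```python
def fdm_subbands(low_hz: float, high_hz: float, n_users: int):
    """Split [low_hz, high_hz] into n_users equal FDM sub-bands."""
    width = (high_hz - low_hz) / n_users
    return [(low_hz + i * width, low_hz + (i + 1) * width)
            for i in range(n_users)]

# e.g. a 400 kHz band shared by 4 users -> four 100 kHz sub-bands
for band in fdm_subbands(900_000, 1_300_000, 4):
    print(band)
```

Each user then transmits only inside its private sub-band, which is why no coordination between users is needed afterwards.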

The channel allocation problem can be solved by two schemes: Static Channel Allocation in LANs and MANs, and Dynamic Channel Allocation.







These are explained below.

1. Static Channel Allocation in LANs and MANs:
It is the classical or traditional approach of allocating a single channel among multiple competing users, using Frequency Division Multiplexing (FDM). If there are N users, the bandwidth is divided into N equal-sized portions, each user being assigned one portion. Since each user has a private frequency band, there is no interference between users.

Dividing the channel into a fixed number of chunks is inefficient, however, because a subchannel sits idle whenever its user has nothing to send. This shows up directly in the mean delay:

T = 1/(U*C - L)

T(FDM) = 1/(U*(C/N) - L/N) = N/(U*C - L) = N*T

Where,

T = mean time delay (sec),
C = capacity of the channel (bits/sec),
L = arrival rate of frames (frames/sec),
1/U = mean frame length (bits/frame),
N = number of subchannels,
T(FDM) = mean time delay when the channel is split into N FDM subchannels

So splitting the channel with FDM makes the mean delay N times worse than using the single channel as a whole.
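Plugging numbers into the formulas above shows the effect directly. The capacity, frame length, arrival rate, and subchannel count below are assumed purely for illustration:

```python
# Mean delay on one shared channel vs. N FDM subchannels,
# using T = 1/(U*C - L). All parameter values are assumed.

C = 100_000_000        # channel capacity, bits/sec
frame_bits = 10_000    # mean frame length 1/U, bits/frame
U = 1 / frame_bits     # so U*C is the service rate in frames/sec
L = 5_000              # arrival rate, frames/sec
N = 10                 # number of FDM subchannels

T = 1 / (U * C - L)                # whole channel
T_fdm = 1 / (U * (C / N) - L / N)  # one FDM subchannel: N times worse

print(f"T = {T*1e3:.1f} ms, T(FDM) = {T_fdm*1e3:.1f} ms")
# -> T = 0.2 ms, T(FDM) = 2.0 ms
```

With these values the shared channel serves U*C = 10,000 frames/sec against 5,000 arriving, so T = 0.2 ms, while each FDM subchannel sees one-tenth the capacity and one-tenth the load, giving exactly N times the delay.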

2. Dynamic Channel Allocation:
Under dynamic allocation, the channel is not divided in advance; capacity is allotted to stations on demand. Analyses of dynamic allocation schemes commonly rest on the following assumptions:

  1. Station Model:
    Assumes that each of the N stations independently produces frames. The probability of a frame being generated in an interval of length Dt is I*Dt, where I is the constant arrival rate of new frames.
  2. Single Channel Assumption:
    A single channel is available for all communication. All stations are equivalent and can send and receive on that channel.
  3. Collision Assumption:
    If two frames overlap in time, the result is a collision. Any collision is an error, and both frames must be retransmitted. Collisions are the only possible errors.
  4. Time Assumption:
    Time may be either continuous or divided into discrete slots.
  5. Carrier Sense Assumption:
    Stations may be able to sense whether the channel is busy before trying to use it.
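These assumptions can be exercised with a small slotted-time simulation. The station count, transmit probability, slot count, and seed below are arbitrary choices for illustration: in each slot every station independently decides to transmit, and a slot with two or more senders counts as a collision.

```python
import random

def simulate(n_stations=10, p=0.1, slots=10_000, seed=1):
    """Slotted shared channel under the station/collision assumptions."""
    rng = random.Random(seed)
    success = collision = idle = 0
    for _ in range(slots):
        # each of the N independent stations transmits with probability p
        senders = sum(rng.random() < p for _ in range(n_stations))
        if senders == 0:
            idle += 1
        elif senders == 1:
            success += 1      # exactly one frame on the channel: delivered
        else:
            collision += 1    # frames overlap in time: all are lost
    return success, collision, idle

s, c, i = simulate()
print(f"success={s}  collision={c}  idle={i}")
```

With N = 10 and p = 0.1, the fraction of slots carrying exactly one frame should be close to N*p*(1-p)^(N-1), roughly 0.39, and the simulated counts land near that.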

Protocol Assumptions:

  • N independent stations.
  • A station is blocked until its generated frame is transmitted.
  • The probability of a frame being generated in a period of length Dt is I*Dt, where I is the arrival rate of frames.
  • Only a single channel is available.
  • Time can be either continuous or slotted.
  • Carrier Sense: a station can sense whether the channel is busy before transmitting.
  • No Carrier Sense: stations cannot sense the channel before transmitting; a timeout is used to detect lost frames.


