COMPUTER NETWORK AND CYBERSECURITY
Q5 Explain the difference between single-bit errors and burst errors in error control in communications systems. (3 marks)
(a) If a noise event causes a burst error to occur that lasts for 0.1 ms (millisecond) and data is being transmitted at 100 Mbps, how many data bits will be affected? (3 marks)
(b) Under what circumstances is the use of parity bits an appropriate error control technique?
Expert Answer
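A single-bit error changes exactly one bit of the data unit, whereas a burst error changes two or more bits, with the burst length measured from the first corrupted bit to the last (the bits in between need not all be corrupted). Bursts are the more common case on real channels because a noise event typically lasts longer than one bit time, so at higher data rates the same event spans more bits. Below is a minimal Python sketch of the contrast; the bit patterns and helper name are illustrative choices, not part of the question.

# Illustrative sketch: a single-bit error vs. a burst error on one byte.
# The XOR masks below are arbitrary examples chosen for this demonstration.

def show(label: str, value: int) -> None:
    """Print an 8-bit value as a bit string."""
    print(f"{label}: {value:08b}")

sent = 0b01101011                 # the byte as transmitted

single_bit = sent ^ 0b00001000    # exactly one bit inverted
burst      = sent ^ 0b00111010    # burst of length 5: bits 5..1 bound the
                                  # damage; bit 2 inside the span survives

show("sent      ", sent)
show("single-bit", single_bit)
show("burst     ", burst)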
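For part (a), the count follows from multiplying the data rate by the duration of the noise event; as a worked check of the arithmetic:

\[
N = R \times t = (100 \times 10^{6}\ \text{bit/s}) \times (0.1 \times 10^{-3}\ \text{s}) = 10^{4}\ \text{bits}.
\]

So 10,000 consecutive bits fall within the 0.1 ms burst.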
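For part (b): a single parity bit detects any odd number of flipped bits but misses any even number and cannot correct anything, so parity is appropriate only where errors are rare and overwhelmingly isolated single-bit events, for example on short, low-noise links; it is a poor fit for burst-prone channels like the one in part (a). Below is a minimal Python sketch, assuming even parity over one byte; the function names are hypothetical, not from any standard library.

# Illustrative sketch (assumed even parity over a single byte): parity
# detects any odd number of flipped bits but misses an even number,
# which is why it suits links where single-bit errors dominate.

def even_parity_bit(byte: int) -> int:
    """Return the even-parity bit for an 8-bit value."""
    return bin(byte & 0xFF).count("1") % 2

def parity_check_ok(byte: int, parity: int) -> bool:
    """True if the received byte still matches its parity bit."""
    return even_parity_bit(byte) == parity

data = 0b01101011
p = even_parity_bit(data)

print(parity_check_ok(data, p))               # True: no error
print(parity_check_ok(data ^ 0b00000100, p))  # False: single-bit error caught
print(parity_check_ok(data ^ 0b00000110, p))  # True: two-bit error slips through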