Difference Between Audio & AudioContext in HTML5
Last Updated: 13 Mar, 2024
Audio is an HTML5 element used to embed sound content in documents. It is a simple way to add audio to a webpage, but it lacks advanced features such as real-time processing and effects. AudioContext, on the other hand, is the entry point to the Web Audio API and provides much finer control over audio: real-time processing, effects, synthesis, and precise scheduling. It is suited to interactive audio applications and games.
Using the <audio> tag
The <audio> element lets you easily add audio files to your webpage, making them playable directly in the browser. It’s a simple way to incorporate sound into your web content without hassle.
Syntax:
<audio src="audio_file.mp3" controls></audio>
Example: This example demonstrates embedding a playable audio file with the <audio> element.
HTML
<!DOCTYPE html>
<html lang="en">

<head>
    <meta charset="UTF-8">
    <meta name="viewport"
          content="width=device-width, initial-scale=1.0">
    <title>Using Audio Element</title>
</head>

<body>
    <audio src=
        "https://media.geeksforgeeks.org/wp-content/uploads/20240218213800/StarWars60.wav"
        controls>
    </audio>
</body>

</html>
Output: The browser renders its native audio player with play/pause, seek, and volume controls.
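The <audio> element can also be driven from JavaScript through the standard HTMLMediaElement API (play(), pause(), and the paused flag are all part of that interface). Below is a minimal sketch; the togglePlayback helper name is our own, not part of any API.

```javascript
// Toggle playback on any media element (<audio> or <video>).
// The element is expected to expose the standard HTMLMediaElement
// members used here: play(), pause(), and the `paused` flag.
function togglePlayback(el) {
    if (el.paused) {
        el.play();   // start or resume playback
        return "playing";
    }
    el.pause();      // pause at the current position
    return "paused";
}

// In a browser (requires a DOM), you might wire it up like:
// const audio = document.querySelector("audio");
// someButton.onclick = () => togglePlayback(audio);
```

This is roughly the extent of the control the element offers: start, stop, seek, volume, and playback rate, with no access to the underlying audio samples.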
Using the Web Audio API and AudioContext
The Web Audio API, through its AudioContext object, lets developers do far more with sound on web pages than simply play audio files. With an AudioContext, developers can build complex audio effects, control playback precisely, and even synthesize audio from scratch, making it a powerful tool for creating interactive audio experiences on the web.
Syntax:
let audioContext = new AudioContext();
Example: This example uses an AudioContext and an OscillatorNode to generate a tone.
HTML
<!DOCTYPE html>
<html lang="en">

<head>
    <meta charset="UTF-8">
    <meta name="viewport"
          content="width=device-width, initial-scale=1.0">
    <title>AudioContext</title>
</head>

<body>
    <button onclick="playSound()">Play Sound</button>
    <script>
        function playSound() {
            // Create an AudioContext object
            const audioContext = new AudioContext();

            // Create an oscillator node
            const oscillator = audioContext.createOscillator();

            // Connect the oscillator to the audio output (speakers)
            oscillator.connect(audioContext.destination);

            // Set the frequency of the oscillator (440 Hz = A4 note)
            oscillator.frequency.setValueAtTime(440, audioContext.currentTime);

            // Start the oscillator
            oscillator.start();

            // Stop the oscillator after 2 seconds
            setTimeout(() => {
                oscillator.stop();
            }, 2000);
        }
    </script>
</body>

</html>
Output: This code creates a simple HTML page with a button. Clicking the button calls the playSound() function, which creates an AudioContext object, creates an OscillatorNode, connects it to the audio output, sets its frequency to 440 Hz (the A4 note), starts the oscillator, and stops it after 2 seconds.
You can copy and paste this code into an HTML file and open it in a web browser to see it in action. When you click the “Play Sound” button, you should hear a short beep.
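The 440 Hz used above is the standard tuning pitch A4. In twelve-tone equal temperament, each semitone multiplies the frequency by 2^(1/12), so any MIDI note number n maps to 440 · 2^((n − 69) / 12) Hz. A small sketch of that formula (the midiToFrequency helper name is our own):

```javascript
// Convert a MIDI note number to its frequency in Hz, using
// twelve-tone equal temperament with A4 (MIDI note 69) = 440 Hz.
function midiToFrequency(note) {
    return 440 * Math.pow(2, (note - 69) / 12);
}

// midiToFrequency(69) -> 440 (A4)
// midiToFrequency(81) -> 880 (A5: one octave up doubles the frequency)
```

The returned value could then be fed to oscillator.frequency.setValueAtTime(...) to play any pitch, not just A4.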
Difference between Audio and AudioContext in HTML5
| Feature | Audio Element | AudioContext API |
|---|---|---|
| Basic Playback | Simple tool for adding sound to web pages. | Provides fine-grained control over audio playback, including real-time processing, effects, synthesis, and more. |
| Real-time Control | Only basic real-time control (e.g., volume, playback rate). | Supports full real-time control, allowing dynamic changes to audio during playback. |
| Audio Effects | Does not support audio effects. | Supports audio effects such as reverb, delay, and filtering. |
| Audio Synthesis | Does not support audio synthesis. | Supports audio synthesis, allowing sound to be generated from scratch. |
| Advanced Control | Does not support advanced control. | Supports advanced control, including sample-accurate scheduling, routing, and more. |
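The effects and routing capabilities mentioned in the table come from connecting processing nodes between a source and the destination. Below is a minimal sketch using a GainNode (a volume effect) as the intermediate node; the playToneWithGain function name is our own, and the context is taken as a parameter so the routing logic stands alone.

```javascript
// Route an oscillator through a gain node before the speakers:
// OscillatorNode -> GainNode -> ctx.destination
function playToneWithGain(ctx, frequency, gain, durationSeconds) {
    const osc = ctx.createOscillator();
    const gainNode = ctx.createGain();

    osc.frequency.setValueAtTime(frequency, ctx.currentTime);
    gainNode.gain.setValueAtTime(gain, ctx.currentTime);

    osc.connect(gainNode);             // source -> effect
    gainNode.connect(ctx.destination); // effect -> speakers

    osc.start();
    osc.stop(ctx.currentTime + durationSeconds); // sample-accurate stop
    return { osc, gainNode };
}

// In a browser: playToneWithGain(new AudioContext(), 440, 0.5, 2);
```

The same pattern extends to longer chains (e.g., source → BiquadFilterNode → ConvolverNode → destination), which is what makes the node graph the core abstraction of the Web Audio API.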