Use pip or apt-get to install BeautifulSoup in Python, and fix common installation errors with the commands provided here.

Install the BeautifulSoup library for Python on Linux with either of the commands below.

Method 1

$ apt-get install python3-bs4 (for Python 3)

Method 2

$ pip install beautifulsoup4

Note: If you don’t have easy_install or pip installed, download the Beautiful Soup source and install it with setup.py:
$ python setup.py install
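
After installing by any of these methods, you can confirm that the package is available with a quick check; the version number printed depends on the release you installed:

$ python3 -c "import bs4; print(bs4.__version__)"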

How to Fix a SyntaxError After Installation

If you see a SyntaxError after installation, the Python 2 version of the code was installed without being converted to Python 3. Reinstall with the Python 3 interpreter:

$ python3 setup.py install
or, convert the Python 2 code to Python 3 by running the 2to3 script on the bs4 directory:
$ 2to3-3.2  -w  bs4
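
For context, 2to3 rewrites Python 2-only syntax into Python 3 syntax. A tiny before-and-after sketch (a made-up line, not actual bs4 code) shows the kind of change it applies:

# Python 2: a print statement, before conversion
print "parsed %d tags" % count
# Python 3: the same line after 2to3 -w rewrites the file
print("parsed %d tags" % count)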

How to Install lxml

BeautifulSoup can use html.parser, the HTML parser built into the Python 3 standard library, but it also supports additional third-party parsers. lxml is one of them and is noticeably faster; install it with any of these commands:

$ apt-get install python-lxml
or
$ easy_install lxml
or
$ pip install lxml
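
Once lxml is installed, pass "lxml" as the second argument to BeautifulSoup to use it in place of html.parser. A minimal sketch (the HTML string here is just a placeholder):

from bs4 import BeautifulSoup

# Parse a small snippet with the lxml parser
soup = BeautifulSoup("<p>Parsed with lxml</p>", "lxml")
print(soup.p.string)  # prints: Parsed with lxml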

How to Install html5lib

$ apt-get install python-html5lib
or
$ easy_install html5lib
or
$ pip install html5lib
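
html5lib parses markup the way a web browser does, which makes it very forgiving with broken HTML. A short sketch (the unclosed tag is deliberate, to show html5lib repairing it):

from bs4 import BeautifulSoup

# html5lib closes the unfinished <p> and adds the html/head/body skeleton
soup = BeautifulSoup("<p>An unclosed paragraph", "html5lib")
print(soup.prettify())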

How BeautifulSoup Works

How to Remove HTML Tags from Web Data

BeautifulSoup takes two arguments: the markup to parse (the file object fp in the first example, or a plain string) and the name of the parser, here html.parser. If they are installed, you can also pass lxml, html5lib, or xml (lxml's XML parser) instead.

Python Code

from bs4 import BeautifulSoup

# Parse an HTML file from disk
with open("index.html") as fp:
    soup = BeautifulSoup(fp, 'html.parser')

# Parse an HTML string directly
soup = BeautifulSoup("<html>a web page</html>", 'html.parser')

# Parse a longer document and print only its text, with the tags removed
html = """<html>
<head>
</head>
<body>
<p>
Here's a paragraph of text!
</p>
<p>
Here's a second paragraph of text!
</p>
</body>
</html>"""
print(BeautifulSoup(html, "html.parser").get_text("\n", strip=True))

The Output

Here's a paragraph of text!
Here's a second paragraph of text!
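
The same approach works on live web data. The sketch below fetches a page with urllib from the standard library and strips its tags; example.com is only a placeholder URL:

import urllib.request
from bs4 import BeautifulSoup

# Download a page (placeholder URL) and keep only its visible text
with urllib.request.urlopen("https://example.com/") as response:
    html = response.read()

print(BeautifulSoup(html, "html.parser").get_text("\n", strip=True))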

You May Also Like: BeautifulSoup Tutorial
