In Databricks, you can view the contents of local files on the driver node and of files in DBFS using several methods. Here's how.
## Viewing Local Files on the Driver Node

Local files are stored on the driver node's local filesystem, typically under /databricks/driver/ (the default working directory).
### 1. Using the %sh Magic Command

```shell
%sh
cat /databricks/driver/your-local-file.txt
```

- Replace `your-local-file.txt` with the name of your file.
- This command uses shell access on the driver to read the file's contents.
### 2. Using Python

```python
with open('/databricks/driver/your-local-file.txt', 'r') as file:
    contents = file.read()
print(contents)
```
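Reading an entire file into memory can be slow for large log or data files. A minimal sketch (the path in the commented example is a placeholder) that reads only the first few lines instead:

```python
from itertools import islice

def head_lines(path, n=5):
    """Return the first n lines of a text file without loading the whole file."""
    with open(path, 'r') as f:
        return list(islice(f, n))

# Example (placeholder path):
# for line in head_lines('/databricks/driver/your-local-file.txt'):
#     print(line, end='')
```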
### 3. Listing Files in the Directory

```shell
%sh
ls -l /databricks/driver/
```
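If you prefer to stay in Python rather than shell out, the standard library can produce a similar listing. A hedged sketch (the directory in the commented example is a placeholder):

```python
import os

def list_files(directory):
    """List directory entries with file sizes, roughly like `ls -l`.

    Returns (name, size_in_bytes) tuples; size is None for subdirectories.
    """
    entries = []
    for name in sorted(os.listdir(directory)):
        full = os.path.join(directory, name)
        size = os.path.getsize(full) if os.path.isfile(full) else None
        entries.append((name, size))
    return entries

# Example (placeholder path):
# for name, size in list_files('/databricks/driver/'):
#     print(name, size)
```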
## Viewing Files in DBFS

DBFS paths look like /mnt/your-folder/ (or dbfs:/mnt/your-folder/) in Spark and dbutils APIs, and the same files are exposed on the driver through the local /dbfs/ mount.
### 1. Using the %fs Magic Command

```shell
%fs head /mnt/your-folder/your-file.txt
```

- The `head` command displays the beginning of the file.
- Note that `%fs` takes DBFS paths directly, so no `/dbfs/` prefix is needed.
2. Using Databricks Utilities
# List files in a DBFS directory
dbutils.fs.ls('/mnt/your-folder/')
# Read file contents
dbutils.fs.head('/mnt/your-folder/your-file.txt', 100) # Read first 100 bytes
### 3. Using Python File I/O with the /dbfs/ Mount

```python
with open('/dbfs/mnt/your-folder/your-file.txt', 'r') as file:
    contents = file.read()
print(contents)
```
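Because the /dbfs/ mount exposes DBFS as ordinary local paths, other standard Python tools work on it too. A sketch (directory in the commented example is a placeholder) that walks a folder recursively, something a single `dbutils.fs.ls` call does not do:

```python
import os

def walk_files(root):
    """Recursively yield the full paths of all files under a directory.

    Works on any local path, including a /dbfs/-mounted DBFS folder.
    """
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            yield os.path.join(dirpath, name)

# Example (placeholder path):
# for path in walk_files('/dbfs/mnt/your-folder/'):
#     print(path)
```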
## Viewing Parquet or CSV Files in DBFS

If the file is in a structured format like Parquet or CSV, you can use Spark to read it:
### Read a CSV File

```python
df = spark.read.csv('/mnt/your-folder/your-file.csv', header=True, inferSchema=True)
df.show()
```
### Read a Parquet File

```python
df = spark.read.parquet('/mnt/your-folder/your-file.parquet')
df.show()
```
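For a quick peek at a small CSV without launching a Spark job, the standard-library csv module also works through the /dbfs/ mount. A hedged sketch (the path in the commented example is a placeholder):

```python
import csv
from itertools import islice

def preview_csv(path, n=5):
    """Return the header row and the first n data rows of a CSV file."""
    with open(path, newline='') as f:
        reader = csv.reader(f)
        header = next(reader)
        rows = list(islice(reader, n))
    return header, rows

# Example (placeholder path):
# header, rows = preview_csv('/dbfs/mnt/your-folder/your-file.csv')
```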
## Accessing File Contents via the Databricks UI

- Go to the Data tab in the Databricks workspace.
- Navigate to the DBFS section to browse files visually.
- Click a file to preview or download its contents.
You can easily access and manage files in Databricks using these methods.