Create PySpark DataFrame from nested dictionary
Last Updated: 17 Jun, 2021
In this article, we discuss creating a PySpark DataFrame from a nested dictionary.
We will use the createDataFrame() method from PySpark. Starting from a dictionary whose values are themselves dictionaries, we iterate over the key/value pairs with items() and build a Row for each pair, merging the outer key and the inner dictionary's fields:
[Row(**{'': k, **v}) for k,v in data.items()]
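Each Row merges two mappings: {'': k} stores the outer dictionary key under an empty-string column name, and **v unpacks the inner dictionary into the remaining columns. The merge itself is plain Python dictionary unpacking, so it can be checked on a small sample without starting Spark:

```python
# Demonstrate the dict merge used inside the Row comprehension.
# {'': k, **v} keeps the outer key under '' and spreads the inner fields.
data = {'student_1': {'student id': 7058, 'country': 'India'}}

flattened = [{'': k, **v} for k, v in data.items()]
print(flattened)
# each entry now holds the outer key plus the inner dictionary's fields
```

Wrapping each merged dict in Row(** ... ) gives createDataFrame() named fields to infer the schema from.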
Example 1: Python program to create student data from a dictionary with a nested address dictionary
Python3
import pyspark
from pyspark.sql import SparkSession
from pyspark.sql import Row
spark = SparkSession.builder.appName('sparkdf').getOrCreate()
data = {
    'student_1': {
        'student id': 7058,
        'country': 'India',
        'state': 'AP',
        'district': 'Guntur'
    },
    'student_2': {
        'student id': 7059,
        'country': 'Srilanka',
        'state': 'X',
        'district': 'Y'
    }
}
# build one Row per student, merging the outer key with the inner fields
rowdata = [Row(**{'': k, **v}) for k, v in data.items()]
final = spark.createDataFrame(rowdata).select(
    'student id', 'country', 'state', 'district')
final.show()
Output:
+----------+--------+-----+--------+
|student id| country|state|district|
+----------+--------+-----+--------+
| 7058| India| AP| Guntur|
| 7059|Srilanka| X| Y|
+----------+--------+-----+--------+
Example 2: Python program to create a DataFrame from nested dictionaries with 3 columns (3 keys)
Python3
import pyspark
from pyspark.sql import SparkSession
from pyspark.sql import Row
spark = SparkSession.builder.appName('sparkdf').getOrCreate()
data = {
    'student_1': {
        'student id': 7058,
        'country': 'India',
        'state': 'AP'
    },
    'student_2': {
        'student id': 7059,
        'country': 'Srilanka',
        'state': 'X'
    }
}
# build one Row per student, merging the outer key with the inner fields
rowdata = [Row(**{'': k, **v}) for k, v in data.items()]
final = spark.createDataFrame(rowdata).select(
    'student id', 'country', 'state')
final.show()
Output:
+----------+--------+-----+
|student id| country|state|
+----------+--------+-----+
| 7058| India| AP|
| 7059|Srilanka| X|
+----------+--------+-----+
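In both examples the outer key (student_1, student_2) lands in a column named '' and is then dropped by select(). A small variant keeps it by giving the key a real column name; the name 'student_name' below is an illustrative choice, not part of the original examples. The flattening step is plain Python, and the resulting list of dicts can be passed to spark.createDataFrame() (wrapped in Row(**d) per element if your Spark version warns about schema inference from dicts):

```python
data = {
    'student_1': {'student id': 7058, 'country': 'India', 'state': 'AP'},
    'student_2': {'student id': 7059, 'country': 'Srilanka', 'state': 'X'}
}

# keep the outer key under an explicit column name instead of ''
rows = [{'student_name': k, **v} for k, v in data.items()]
print(rows)
```

Passing rows to spark.createDataFrame() would then yield a 'student_name' column alongside 'student id', 'country', and 'state', with no select() needed to drop an unnamed column.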