Once you have the JSON file at a specific location, you can read the column names as shown below, but you will need a good understanding of the JSON structure.
Using Spark SQL:
val df = spark.read.option("multiline",true).json("/path/to/json")
df.createOrReplaceTempView("TestTable")
val selectedColumnsDf = spark.sql("""SELECT meta.view.columns.id, meta.view.columns.position, meta.view.createdAt FROM TestTable""")
Using the DataFrame API:
val df = spark.read.option("multiline",true).json("/path/to/json")
val selectedColumnsDf = df.select("meta.view.columns.id","meta.view.columns.position","meta.view.createdAt")
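One thing to note: if `meta.view.columns` is an array of structs (which the dotted paths above suggest), then selecting `meta.view.columns.id` returns an array of all ids in a single row, not one row per column. If you want one row per column entry, you can explode the array first. A minimal sketch, assuming the same file layout (`/path/to/json` and the field names are from your example):

```scala
// Sketch: flatten meta.view.columns so each array element becomes its own row.
// Assumes meta.view.columns is an array of structs with id and position fields.
import org.apache.spark.sql.functions.{col, explode}

val df = spark.read.option("multiline", true).json("/path/to/json")

val perColumnDf = df
  .select(explode(col("meta.view.columns")).as("column"),
          col("meta.view.createdAt"))
  .select("column.id", "column.position", "createdAt")

perColumnDf.show()
```

This way `id` and `position` line up row by row instead of sitting in parallel arrays, which is usually easier to work with downstream.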
I am selecting only three columns to give you the idea; you can add the remaining columns as per your requirement.