java - Databricks spark-csv check for empty file


I'm trying to read a TSV file into a DataFrame object with the following code:

SQLContext sqlContext = new SQLContext(javaSparkContext);
Map<String, String> sqlContextOptions = new HashMap<>();
sqlContextOptions.put("header", "true");
sqlContextOptions.put("delimiter", "\t");
DataFrame df = sqlContext.read()
        .format("com.databricks.spark.csv")
        .options(sqlContextOptions)
        .load(path);

Right now, the code throws an UnsupportedOperationException if it encounters an empty file. I want to handle empty files, but I don't want to assume that this exception always means the file is empty. What is the best practice for checking whether a given file is empty?

I don't see path explicitly defined, so I'm assuming it's a String containing the path to the file. If that's the case, you can open it in a BufferedReader object and check whether you can read anything from it.

try (BufferedReader br = new BufferedReader(new FileReader(path))) {
    if (br.readLine() == null) {
        // handle empty file...
    } else {
        // do something...
    }
}
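Alternatively, if you only care about whether the file has any content at all (as opposed to reading its first line), you could check the file size up front. This is a minimal sketch, assuming path is a local file path; the isEmpty helper and class name here are hypothetical, not part of the original code. Note that a file containing only a header line would not count as empty by this check:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class EmptyFileCheck {
    // Hypothetical helper: true if the file at the given path has zero bytes.
    static boolean isEmpty(String path) throws IOException {
        return Files.size(Paths.get(path)) == 0L;
    }

    public static void main(String[] args) throws IOException {
        // Demonstrate with two temp files: one empty, one with a header row.
        Path empty = Files.createTempFile("empty", ".tsv");
        Path nonEmpty = Files.createTempFile("data", ".tsv");
        Files.write(nonEmpty, "col1\tcol2\n".getBytes());

        System.out.println(isEmpty(empty.toString()));     // prints true
        System.out.println(isEmpty(nonEmpty.toString()));  // prints false
    }
}
```

You could then skip the sqlContext.read() call entirely when the check returns true, rather than catching the exception after the fact.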
