I am trying to load data from Redshift into Spark using spark-redshift in Java.
// DataFrame ipLocation = sqlContext.read().schema(customSchema).json(jsonFile.getFile().getPath());
DataFrame ipLocation = sqlContext.read()
    .format("com.databricks.spark.redshift")
    .option("url", "jdbc:redshift://test.us-east-1.redshift.amazonaws.com:5439/spay?user=&password=")
    .option("query", "select * from test.iplocation")
    .option("tempdir", "s3n://test/")
    .load();
POM dependencies added:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.6.1</version>
    <!-- <scope>provided</scope> -->
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.1</version>
    <!-- <scope>provided</scope> -->
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.6.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/com.databricks/spark-redshift_2.10 -->
<dependency>
    <groupId>com.databricks</groupId>
    <artifactId>spark-redshift_2.10</artifactId>
    <version>0.6.0</version>
</dependency>
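One thing that may be worth checking: spark-redshift does not bundle the Amazon Redshift JDBC driver, so the driver jar must be on the classpath separately. As a sketch, assuming the v2 driver that Amazon later published to Maven Central (the exact version is an assumption; in 2016 the v1 driver was instead served from Amazon's own Maven repository), the extra dependency would look something like:

```xml
<!-- Amazon Redshift JDBC driver; spark-redshift needs this on the classpath.
     groupId/artifactId are for the v2 driver on Maven Central; adjust for
     whichever driver jar you actually use. -->
<dependency>
    <groupId>com.amazon.redshift</groupId>
    <artifactId>redshift-jdbc42</artifactId>
    <version>2.1.0.9</version>
</dependency>
```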
I am getting the following error:
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.databricks.spark.redshift.JDBCWrapper.registerDriver(RedshiftJDBCWrapper.scala:59)
    at com.databricks.spark.redshift.RedshiftRelation.schema(RedshiftRelation.scala:56)
    at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:37)
    at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
    at com.samsung.cloud.mopay.trsanalytics.test.etl.spark.TestSparkPostToken.testSpark(TestSparkPostToken.java:69)
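Since the failure happens inside JDBCWrapper.registerDriver, an InvocationTargetException there often wraps a driver-loading problem. A quick way to check whether a JDBC driver class is actually visible on the current classpath is a small sketch like the following (the driver class name com.amazon.redshift.jdbc41.Driver is an assumption for the Redshift JDBC 4.1 jar; substitute whichever driver you ship):

```java
public class DriverCheck {
    // Returns true if the named JDBC driver class can be loaded
    // from the current classpath, false otherwise.
    static boolean driverPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Hypothetical class name for the Redshift JDBC 4.1 driver jar.
        String driver = "com.amazon.redshift.jdbc41.Driver";
        System.out.println(driverPresent(driver)
                ? "Redshift JDBC driver found"
                : "Redshift JDBC driver missing");
    }
}
```

Running this from the same classpath as the failing test makes it easy to tell whether the driver jar is missing from the POM or just not reaching the runtime classpath.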
Am I missing a dependency here? Does spark-redshift support Java? Or am I using the wrong version?