Fixing the error when connecting to a MySQL database from pyspark
The code used to connect to a MySQL database from pyspark is as follows:
from pyspark import SparkConf
from pyspark.sql import SparkSession

spark_conf = SparkConf().setAppName("MyApp").setMaster("local")
spark = SparkSession.builder.config(conf=spark_conf).getOrCreate()
url = "jdbc:mysql://localhost:3306/test?useUnicode=true&characterEncoding=UTF-8&useSSL=false"
table_name = "tab_tf"
properties = {
    "user": "root",
    "password": "root"
}
# Read the data from the MySQL table
df = spark.read.jdbc(url=url, table=table_name, properties=properties)
# Show the data
df.show()
When the program is run, it fails with the following error:
py4j.protocol.Py4JJavaError: An error occurred while calling o32.jdbc.
: java.sql.SQLException: No suitable driver
at java.sql.DriverManager.getDriver(DriverManager.java:315)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:105)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$6.apply(JDBCOptions.scala:105)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:104)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:332)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:242)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:230)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:186)
at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:257)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Some research showed that the cause is a missing MySQL JDBC driver in Spark, so I downloaded the Connector/J jar matching my MySQL version from https://downloads.mysql.com/archives/c-j/
To check the MySQL version, run: mysql -V
After the download finishes, unpack the archive and copy mysql-connector-java-8.0.30.jar into the jars directory of the Spark installation.
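If you would rather not modify the Spark installation directory, the driver jar can also be supplied when the session is created via the spark.jars setting. This is only a sketch: the jar path below is an assumption and should point to wherever you unpacked Connector/J; some deployments may additionally need spark.driver.extraClassPath.
from pyspark.sql import SparkSession

# Sketch: make the MySQL driver available without copying it into Spark's jars directory.
# The jar path is a placeholder -- replace it with the actual location on your machine.
spark = (SparkSession.builder
         .appName("MyApp")
         .master("local")
         .config("spark.jars", "/path/to/mysql-connector-java-8.0.30.jar")
         .getOrCreate())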
Re-run the program and the error is gone; df.show() now prints the contents of the table.
Reference: py4j.protocol.Py4JJavaError: An error occurred while calling o32.jdbc.
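If Spark still reports "No suitable driver" even with the jar on the classpath, naming the driver class explicitly in the connection properties usually resolves it. The class name below assumes Connector/J 8.x; the older 5.x series uses com.mysql.jdbc.Driver instead.
# Explicitly name the JDBC driver class so DriverManager does not have to guess.
# com.mysql.cj.jdbc.Driver ships with Connector/J 8.x; 5.x jars use com.mysql.jdbc.Driver.
properties = {
    "user": "root",
    "password": "root",
    "driver": "com.mysql.cj.jdbc.Driver"
}
df = spark.read.jdbc(url=url, table=table_name, properties=properties)
df.show()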
That concludes this article on fixing the error when connecting to MySQL from pyspark.