How does Spark connect to a MySQL database

小億
2024-05-06 19:49:53

There are two common ways to connect to a MySQL database from Spark; both rely on the MySQL JDBC driver (Connector/J) being available on the classpath:

  1. Using spark.read.jdbc:
import java.util.Properties

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("MySQLExample")
  .getOrCreate()

// Connection details: replace hostname, port, databaseName and tableName with your own values
val url = "jdbc:mysql://hostname:port/databaseName"
val table = "tableName"
val properties = new Properties()
properties.put("user", "username")
properties.put("password", "password")

// Read the whole table into a DataFrame over JDBC
val df = spark.read.jdbc(url, table, properties)
df.show()
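
Writing back works symmetrically through the same connection properties. A minimal sketch, reusing the url and properties defined above; the table name "resultTable" is only a placeholder:

// Append the DataFrame to a MySQL table; Spark creates the table if it does not exist
df.write
  .mode("append")
  .jdbc(url, "resultTable", properties)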
  2. Using the generic JDBC data source (format("jdbc")) with MySQL Connector/J:

First, add the path to the MySQL Connector/J jar to the spark-submit command (the same jar is also needed for the first approach):

spark-submit --jars /path/to/mysql-connector-java.jar --class your_class your_jar.jar
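
Alternatively, the driver can be pulled from Maven Central with --packages instead of shipping the jar manually; the version below is only an example, use the Connector/J release that matches your MySQL server:

spark-submit --packages mysql:mysql-connector-java:8.0.33 --class your_class your_jar.jar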

Then connect to MySQL in your code through the JDBC data source:

import java.util.Properties

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("MySQLExample")
  .getOrCreate()

// Connection details: replace hostname, port, databaseName and tableName with your own values
val url = "jdbc:mysql://hostname:port/databaseName"
val table = "tableName"
val properties = new Properties()
properties.put("user", "username")
properties.put("password", "password")

// Read the table through the generic JDBC data source
val df = spark.read.format("jdbc")
  .option("url", url)
  .option("dbtable", table)
  .option("user", properties.getProperty("user"))
  .option("password", properties.getProperty("password"))
  .load()

df.show()
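
For larger tables the JDBC source can also read in parallel. A sketch assuming the table has a numeric id column whose values span roughly 1 to 1000000; the column name and bounds are placeholders you must adjust:

// Parallel read: Spark splits the id range into numPartitions slices
// and issues one JDBC query per slice
val parallelDf = spark.read.format("jdbc")
  .option("url", url)
  .option("dbtable", table)
  .option("user", properties.getProperty("user"))
  .option("password", properties.getProperty("password"))
  .option("partitionColumn", "id")   // must be a numeric, date or timestamp column
  .option("lowerBound", "1")
  .option("upperBound", "1000000")
  .option("numPartitions", "4")
  .load()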

These are two ways to connect to a MySQL database from Spark; pick whichever suits your project.
