In Spark, the DataFrameWriter jdbc method cannot delete data from a JDBC source: it only supports writing DataFrame contents (append/overwrite). To delete rows, you have to execute the DELETE statement over a plain JDBC connection, for example with java.sql.DriverManager:
import org.apache.spark.sql._
import java.sql.DriverManager

val spark = SparkSession.builder()
  .appName("Delete JDBC data")
  .config("spark.master", "local")
  .getOrCreate()

val jdbcUrl = "jdbc:mysql://localhost:3306/mydatabase"
val jdbcUsername = "username"
val jdbcPassword = "password"
val table = "my_table"
val condition = "id > 100"
val deleteQuery = s"DELETE FROM $table WHERE $condition"

// Spark cannot issue DELETE statements, so open a plain JDBC connection
val connection = DriverManager.getConnection(jdbcUrl, jdbcUsername, jdbcPassword)
try {
  val statement = connection.createStatement()
  val rowsDeleted = statement.executeUpdate(deleteQuery)
  println(s"Deleted $rowsDeleted rows")
} finally {
  connection.close()
}

// Optionally, read the table back with Spark to verify the remaining rows
val connectionProperties = new java.util.Properties()
connectionProperties.put("user", jdbcUsername)
connectionProperties.put("password", jdbcPassword)
val df = spark.read.jdbc(jdbcUrl, table, connectionProperties)
In the code above, deleteQuery is the DELETE statement to run, and connectionProperties holds the user and password needed for the JDBC connection. Note that DataFrameWriter's jdbc method can only insert or overwrite data; it has no delete mode, so the DELETE statement must be executed directly through the JDBC driver (e.g. java.sql.DriverManager). Spark itself is only needed afterwards, if you want to read the remaining rows back into a DataFrame.
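Because the approach above interpolates the condition into the SQL string, values that come from user input should instead be bound through a PreparedStatement. The following is a minimal sketch of that variant; it assumes the same MySQL URL and credentials as above and that the MySQL JDBC driver is on the classpath:

```scala
import java.sql.DriverManager

// Hypothetical connection details, matching the example above
val jdbcUrl = "jdbc:mysql://localhost:3306/mydatabase"
val jdbcUsername = "username"
val jdbcPassword = "password"

val connection = DriverManager.getConnection(jdbcUrl, jdbcUsername, jdbcPassword)
try {
  // The threshold is bound as a parameter instead of being spliced into the SQL,
  // which avoids SQL injection when the value comes from outside the program.
  val statement = connection.prepareStatement("DELETE FROM my_table WHERE id > ?")
  statement.setInt(1, 100)
  val rowsDeleted = statement.executeUpdate()
  println(s"Deleted $rowsDeleted rows")
} finally {
  connection.close()
}
```

The table name itself cannot be bound as a parameter, so it must still come from a trusted source.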