Spark DataFrame filter using binary (Array[Byte]) data


I have a DataFrame backed by a JDBC table hitting MySQL, and I need to filter it using a UUID. The data is stored in MySQL as BINARY(16), and when queried out in Spark it is converted to Array[Byte], as expected.

I'm new to Spark and have been trying various ways to pass a variable of type UUID into the DataFrame's filter method. I've tried statements like:

```scala
val id: UUID = // other logic
df.filter(s"id = $id")
df.filter("id = " + convertToByteArray(id))
df.filter("id = " + convertToHexString(id))
```

All of these error out with different messages. I need to somehow pass in the binary type, but I can't seem to put my finger on how to do it properly.
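For context, a helper like the `convertToByteArray` mentioned above can be sketched as follows. This is my assumption of its shape, not code from the post: MySQL's BINARY(16) typically stores the UUID's most-significant long followed by its least-significant long, both big-endian.

```scala
import java.nio.ByteBuffer
import java.util.UUID

// Assumed conversion: render the UUID as the 16 bytes MySQL stores
// in a BINARY(16) column (most-significant long first, big-endian).
def uuidToBytes(id: UUID): Array[Byte] = {
  val buf = ByteBuffer.allocate(16)
  buf.putLong(id.getMostSignificantBits)
  buf.putLong(id.getLeastSignificantBits)
  buf.array()
}

val id = UUID.fromString("123e4567-e89b-12d3-a456-426614174000")
val bytes = uuidToBytes(id) // 16-byte array starting with 0x12, 0x3e, ...
```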

Any help appreciated.

After reviewing more sources online, I found a way to accomplish this without using the filter method.

When reading with the SparkSession, I use an ad hoc subquery instead of the table name, as follows:

```scala
sparkSession.read.jdbc(
  connectionString,
  s"(SELECT id /* other columns omitted */ FROM mytable WHERE id = 0x$id) AS mytable",
  props)
```

This pre-filters the results for me, and I can work with the DataFrame as I need.
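For the `0x$id` literal in that subquery to be valid SQL, the UUID has to be rendered as plain hex digits with the dashes stripped, since MySQL's `0x...` syntax accepts only hex characters. A minimal sketch (`uuidToHex` is my name for the helper, not from the post):

```scala
import java.util.UUID

// Render the UUID as the 32 hex characters MySQL's 0x... literal expects.
def uuidToHex(id: UUID): String = id.toString.replace("-", "")

val id = UUID.fromString("123e4567-e89b-12d3-a456-426614174000")
val query =
  s"(SELECT id /* other columns omitted */ FROM mytable WHERE id = 0x${uuidToHex(id)}) AS mytable"
```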

If anyone knows of a solution using filter, I'd still love to hear it, as it would be useful in some cases.
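One filter-based approach that I believe should work, though I have not run it against a live MySQL source: build the byte array yourself and wrap it in `lit`, which produces a binary-typed literal, so the comparison happens on the binary column rather than on a formatted string. This is an untested sketch; `uuidToBytes` is a hypothetical helper writing the UUID's two longs big-endian.

```scala
import java.nio.ByteBuffer
import java.util.UUID
import org.apache.spark.sql.functions.{col, lit}

// Hypothetical helper: the 16 bytes MySQL stores for this UUID.
def uuidToBytes(id: UUID): Array[Byte] = {
  val buf = ByteBuffer.allocate(16)
  buf.putLong(id.getMostSignificantBits)
  buf.putLong(id.getLeastSignificantBits)
  buf.array()
}

// lit() on an Array[Byte] yields a BinaryType literal, so the equality
// is evaluated byte-for-byte against the BINARY(16) column.
val filtered = df.filter(col("id") === lit(uuidToBytes(id)))
```

Note that unlike the subquery approach, this filter is applied by Spark after the full table is read (unless the JDBC source can push the predicate down), so the subquery may still be preferable for large tables.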

