Description
I'm currently writing a DataFrame/RDD consisting of <Key: String, Data: Array[Byte]>.
The scenario is that we have a hash representing an entity; some of the data is loaded through a Spark job and some is loaded by an API.
The API also needs to read the data that is loaded by the Spark job.
We store the data as a byte array to reduce its size; the data itself is compressed and encoded.
We are using Jedis to read the data, and we also ran some tests doing a direct load, which worked fine.
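For reference, the read on the API side looks roughly like this (a minimal sketch; the key, field and connection details are placeholders, not our real values):

```scala
import redis.clients.jedis.Jedis

// Placeholder key/field names and connection details, just to illustrate the binary read.
val jedis = new Jedis("localhost", 6379)
try {
  // hget with byte[] arguments returns the raw bytes exactly as they were written.
  val raw: Array[Byte] = jedis.hget("entity:1234".getBytes("UTF-8"), "someField".getBytes("UTF-8"))
  // `raw` is the compressed, encoded payload; we decode/decompress it from here.
} finally {
  jedis.close()
}
```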
Below, the first result is from a direct load, which returns the correct value, and the second is from a DataFrame load.
From #205 I saw that byte array Lists were implemented. Is it possible to have the same for Hashes? Or is there a workaround for it?
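One workaround I'm considering (a minimal sketch, assuming the Key column is the field name within a single entity hash; the hash key, host and port are placeholders) is to bypass the DataFrame writer and use Jedis' binary HSET from foreachPartition:

```scala
import redis.clients.jedis.Jedis

// Placeholders -- adjust to the real entity hash key and Redis connection.
val hashKey = "entity:1234".getBytes("UTF-8")
val redisHost = "localhost"
val redisPort = 6379

// rdd: RDD[(String, Array[Byte])] -- the <Key, Data> pairs described above.
rdd.foreachPartition { partition =>
  // One connection and one pipeline per partition to avoid per-record overhead.
  val jedis = new Jedis(redisHost, redisPort)
  val pipeline = jedis.pipelined()
  partition.foreach { case (field, data) =>
    // Binary hset writes the byte array as-is, with no String round-trip.
    pipeline.hset(hashKey, field.getBytes("UTF-8"), data)
  }
  pipeline.sync()
  jedis.close()
}
```

If the corruption comes from a byte[] to String conversion in the hash writer, this should keep the data binary end to end, but native support for byte array Hashes would obviously be nicer.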
Thanks in advance.