
Spark: How to transpose and explode columns with dynamic nested arrays

I applied the algorithm from the question Spark: How to transpose and explode columns with nested arrays to transpose and explode a nested Spark DataFrame with dynamic arrays.

I have added the row """{"id":3,"c":[{"date":3,"val":3,"val_dynamic":3}]}""" to the dataframe; in column c, the array's structs now carry a new val_dynamic field, which can appear on a random basis.
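
For concreteness, here is a minimal sketch of how such an input could be built from JSON strings. The id values and the row without val_dynamic are placeholders, not the exact data from the question:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Placeholder rows: one struct without the optional field, one with it.
    rows = [
        """{"id":1,"c":[{"date":1,"val":1}]}""",
        """{"id":3,"c":[{"date":3,"val":3,"val_dynamic":3}]}""",
    ]

    # Reading the JSON lines lets Spark merge the schemas, so val_dynamic
    # becomes a nullable member of the struct for every row.
    df = spark.read.json(spark.sparkContext.parallelize(rows))
    df.printSchema()
    df.show(truncate=False)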

I’m looking for required output 2 (transpose and explode), but even an example of required output 1 (transpose) would be very useful.

Input df:

Required output 1 (transpose_df):

Required output 2 (explode_df):

Current code:

Current outcome:

Ref: Transpose column to row with Spark


Answer

stack requires that all stacked columns have the same type. The problem here is that the structs inside the arrays have different members. One approach is to add the missing members to all structs so that the approach from my previous answer works again.
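
As a rough sketch of that idea (not the answer's original code), one can rebuild every struct with a transform expression, keeping the members that exist and filling the missing ones with cast(null as long). The column list cols and the member names are assumptions based on the data shown above:

    from pyspark.sql import functions as F

    cols = ["c"]  # array-of-struct columns to normalize (assumed)
    members = ["date", "val", "val_dynamic"]  # union of all struct members (assumed)

    full_struct_df = df
    for c in cols:
        # Members actually present in this column's element struct type.
        existing = [f.name for f in full_struct_df.schema[c].dataType.elementType.fields]
        struct_expr = ", ".join(
            f"x.{m} as {m}" if m in existing else f"cast(null as long) as {m}"
            for m in members
        )
        # Rebuild each struct so that all columns share the same element type.
        full_struct_df = full_struct_df.withColumn(
            c, F.expr(f"transform({c}, x -> struct({struct_expr}))")
        )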

full_struct_df now has the schema

From here the logic works as before:
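
A sketch of that earlier logic, assuming the hypothetical full_struct_df and cols from above: stack turns the array columns into (name, value) rows, explode turns each array element into its own row, and value.* flattens the struct members into columns.

    # Transpose: one (id, name, value) row per original array column.
    stack_expr = (
        f"stack({len(cols)}, "
        + ", ".join(f"'{c}', {c}" for c in cols)
        + ") as (name, value)"
    )
    transpose_df = full_struct_df.selectExpr("id", stack_expr)

    # Explode: one row per struct, with the struct members as flat columns.
    explode_df = (
        transpose_df
        .selectExpr("id", "name", "explode(value) as value")
        .select("id", "name", "value.*")
    )
    explode_df.show(truncate=False)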

The first part of this answer requires that

  • each column mentioned in cols is an array of structs
  • all members of all structs are longs. The reason for this restriction is the cast(null as long) when creating the transform expression.