scala - Custom Ordering for sorting in a Spark RDD


I have an RDD of (String, Long, Int, String, String, List[Integer], String, String, String, Long, Long). I want to sort on the fields of the tuple in order: if _._1 is equal, move on to _._2, otherwise return the result of the first comparison, and continue like this until the last element of the tuple.

The solution below looks clumsy. Is there a better way in Scala?

What I'm trying:

    val customOrdering = new Ordering[(String, Long, Int, String, String,
        List[Integer], String, String, String, Long, Long)] {
      override def compare(a: (String, Long, Int, String, String,
                               List[Integer], String, String, String, Long, Long),
                           b: (String, Long, Int, String, String,
                               List[Integer], String, String, String, Long, Long)) = {
        if (a._1.compare(b._1) == 0) {
          if (a._2 == b._2) {
            ...
          } else if (a._2 < b._2) {
            1
          } else {
            0
          }
        } else if (a._1.compare(b._1) < 0) {
          1
        } else {
          0
        }
      }
    }
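For reference, here is a self-contained sketch of the same chained-comparison pattern on a smaller, hypothetical 3-field tuple, with compare returning negative/zero/positive as the Ordering contract expects; the object name and sample data are illustrative only:

```scala
object CustomOrderingDemo {
  // Hand-rolled lexicographic Ordering for a 3-field tuple (sketch).
  // Each step returns the first non-zero field comparison.
  val customOrdering = new Ordering[(String, Long, Int)] {
    override def compare(a: (String, Long, Int), b: (String, Long, Int)): Int = {
      val c1 = a._1.compareTo(b._1)
      if (c1 != 0) c1
      else {
        val c2 = java.lang.Long.compare(a._2, b._2)
        if (c2 != 0) c2
        else Integer.compare(a._3, b._3)
      }
    }
  }

  def main(args: Array[String]): Unit = {
    val data = Seq(("b", 1L, 0), ("a", 2L, 0), ("a", 1L, 5), ("a", 1L, 3))
    // Sorts first by _1, then _2, then _3
    println(data.sorted(customOrdering))
    // List((a,1,3), (a,1,5), (a,2,0), (b,1,0))
  }
}
```

The same Ordering value can be passed implicitly to RDD.sortBy, which takes an implicit Ordering on the key type.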

I'd try an approach that converts the tuples to sequences, zips them, and finds the first comparison that is non-zero.

It would look something like:

    first.productIterator.toSeq.zip(second.productIterator.toSeq).find {
      // productIterator yields Any, so cast each field to Comparable
      case (x, y) => x.asInstanceOf[Comparable[Any]].compareTo(y) != 0
    } match {
      case Some((x, y)) =>
        if (x.asInstanceOf[Comparable[Any]].compareTo(y) < 0) 1 else 0
      case None => ??? // all fields equal
    }
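A self-contained sketch of that idea, assuming every field can be cast to java.lang.Comparable (true for String and boxed Long/Int, but not for a List[Integer] field, which would throw); lexCompare is a hypothetical helper name:

```scala
object LexCompare {
  // Compare two tuples field by field; the first non-zero comparison wins.
  // Assumes every field is Comparable after boxing; non-Comparable fields
  // (e.g. List[Integer]) would fail the cast at runtime.
  def lexCompare(a: Product, b: Product): Int =
    a.productIterator.zip(b.productIterator)
      .map { case (x, y) => x.asInstanceOf[Comparable[Any]].compareTo(y) }
      .find(_ != 0)
      .getOrElse(0)

  def main(args: Array[String]): Unit = {
    println(lexCompare(("a", 1L), ("a", 2L))) // -1: first field ties, second decides
    println(lexCompare(("b", 1L), ("a", 9L))) // 1: first field already differs
    println(lexCompare(("a", 1L), ("a", 1L))) // 0: fully equal
  }
}
```

Note that for tuples of arity up to 9 the Scala standard library already provides implicit tuple Orderings (given Orderings for the components), so no hand-rolled comparator is needed there; an 11-tuple falls outside that range, which is why something like the above is required.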
