scala - Unable to rename DataFrame's column


When showing the partitions of a table using Spark 1.6.3, I am unable to rename the DataFrame's column. In order to reproduce the problem, build the following partitioned table:

sqlcontext.sql("create table test (foo string) partitioned (dt string)")  val df = sqlcontext.sparkcontext.parallelize(seq(("foo", 1), ("bar", 2)))   .todf("column", "dt")  df.write.mode("overwrite").partitionby("dt").insertinto("test") 

I can retrieve the list of available partitions in the table:

scala> sqlContext.sql(s"show partitions test").show
+------+
|result|
+------+
|  dt=1|
|  dt=2|
+------+

Now, let's rename the column using withColumnRenamed, like this:

sqlcontext.sql(s"show partitions test").withcolumnrenamed("result", "partition").show 

which fails with the following error message:

org.apache.spark.sql.AnalysisException: resolved attribute(s) result#835 missing from result#826 in operator !Project [result#835 AS partition#836];

I can work around the problem using a simple alias:

> sqlcontext.sql(s"show partitions test").select($"result".as("partition")).show  +---------+ |partition| +---------+ |     dt=1| |     dt=2| +---------+ 

But why does it fail in the first place?

