json - Scala Pickling doesn't seem to work with Point2D.Double

Tags: json scala scala-pickling

I'm working on a Scala program that uses the Scala Pickling library to serialize and deserialize a Map of String keys to Point2D.Double values (from the java.awt.geom package).

The relevant logic is as follows:

contents +=
      new Button("Save Config") {
        reactions += {
          case ButtonClicked(_) => {
            var m: Map[String, Point2D.Double] = Map()
            nodeFields.foreach(x => {
              m += (x._1 -> new Point2D.Double(x._2._1.text.toDouble, x._2._2.text.toDouble))
            })
            val pkl = m.pickle
            fc.showSaveDialog(null)
            val outputFile = fc.selectedFile
            val writer = new PrintWriter(outputFile)
            writer.write(pkl.value)
            writer.close()
            Dialog.showMessage(null, "Success!")
          }
        }
      }

If you need to see more, here's the commit with the offending logic.

As things stand, pkl.value outputs a JSON-formatted string that is a working serialization of the Map[String, Point2D.Double], except that the values of the Point2D.Double objects are dropped!

Here's a snippet of the output:

{
  "$type": "scala.collection.mutable.Map[java.lang.String,java.awt.geom.Point2D.Double]",
  "elems": [
    {
      "$type": "scala.Tuple2[java.lang.String,java.awt.geom.Point2D.Double]",
      "_1": "BOTTOMLANE\r",
      "_2": {

      }
    },
    {
      "$type": "scala.Tuple2[java.lang.String,java.awt.geom.Point2D.Double]",
      "_1": "UPPERLANESECOND_0\r",
      "_2": {

      }
    },
    {
      "$type": "scala.Tuple2[java.lang.String,java.awt.geom.Point2D.Double]",
      "_1": "upperSecondTower_1",
      "_2": {

      }
    },
...
  ]
}

What can I do to fix this?

Best Answer

scala-pickling can't directly pickle/unpickle Point2D.Double because it has no public fields it can use (the x and y values are accessed through the getX and getY getters).

A possible Pickler/Unpickler for Point2D.Double is:

object Point2DPickler {
  import scala.pickling._
  import scala.pickling.Defaults._
  import java.awt.geom.Point2D

  type DoublePoint = java.awt.geom.Point2D.Double

  implicit object Point2DDoublePickle extends Pickler[DoublePoint] with Unpickler[DoublePoint] {
    // Delegate reading the coordinate values to the built-in Double unpickler.
    private val doubleUnpickler = implicitly[Unpickler[Double]]

    override def tag = FastTypeTag[java.awt.geom.Point2D.Double]

    override def pickle(point: DoublePoint, builder: PBuilder) = {
      builder.beginEntry(point)
      // Write the coordinates as two nested Double entries named "x" and "y",
      // pulling the values through the getters.
      builder.putField("x",
        b => b.hintTag(FastTypeTag.Double).beginEntry(point.getX).endEntry()
      )
      builder.putField("y",
        b => b.hintTag(FastTypeTag.Double).beginEntry(point.getY).endEntry()
      )
      builder.endEntry()
    }

    override def unpickle(tag: String, reader: PReader): DoublePoint = {
      // Read the "x" and "y" fields back and rebuild the point via its constructor.
      val x = doubleUnpickler.unpickleEntry(reader.readField("x")).asInstanceOf[Double]
      val y = doubleUnpickler.unpickleEntry(reader.readField("y")).asInstanceOf[Double]
      new Point2D.Double(x, y)
    }
  }
}

It can be used like this:

import scala.pickling.Defaults._
import scala.pickling.json._
import java.awt.geom.Point2D

import Point2DPickler._

val dpoint = new Point2D.Double(1d, 2d)

scala> val json = dpoint.pickle
json: pickling.json.pickleFormat.PickleType =
JSONPickle({
  "$type": "java.awt.geom.Point2D.Double",
  "x": {
    "$type": "scala.Double",
    "value": 1.0
  },
  "y": {
    "$type": "scala.Double",
    "value": 2.0
  }
})

scala> val dpoint2 = json.value.unpickle[java.awt.geom.Point2D.Double]
dpoint2: java.awt.geom.Point2D.Double = Point2D.Double[1.0, 2.0]
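Applied to the situation in the question, importing Point2DPickler._ should make the whole map round-trip with its coordinates intact. Here is a minimal, untested sketch (assuming scala-pickling 0.10.x and that the implicit collection picklers from Defaults pick up the custom element pickler; the key names and values are hypothetical stand-ins for the map built from nodeFields):

import scala.pickling.Defaults._
import scala.pickling.json._
import java.awt.geom.Point2D

import Point2DPickler._

// Hypothetical data standing in for the map built from nodeFields in the question.
val m: Map[String, Point2D.Double] = Map(
  "BOTTOMLANE" -> new Point2D.Double(100.0, 200.0),
  "upperSecondTower_1" -> new Point2D.Double(300.0, 400.0)
)

// With Point2DDoublePickle in implicit scope the x/y fields are written out...
val pkl = m.pickle

// ...and can be read back into an equivalent map.
val m2 = pkl.value.unpickle[Map[String, Point2D.Double]]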

Regarding "json - Scala Pickling doesn't seem to work with Point2D.Double", we found a similar question on Stack Overflow: https://stackoverflow.com/questions/31670266/
