Bug #6402
serialize-name for write-/read-json is not working
History
#2 Updated by Constantin Asofiei almost 2 years ago
- Priority changed from Normal to High
Ovidiu, please take this one next:
def temp-table tt1 serialize-name "table1"
    field f1 as int serialize-name "field1".
def dataset ds1 serialize-name "dataset1" for tt1.

create tt1.
tt1.f1 = 10.

dataset ds1:write-json("file", "a.json").
dataset ds1:empty-dataset().
dataset ds1:read-json("file", "a.json").

find first tt1.
message tt1.f1.
At least the dataset and temp-table names are not matched via SERIALIZE-NAME during READ-JSON.
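For reference, when SERIALIZE-NAME is honored, the JSON written by the snippet above should be keyed on the serialize-names ("dataset1", "table1", "field1"), not the 4GL names (ds1, tt1, f1). A minimal Python sketch of that expected round-trip shape (the exact layout is an assumption based on standard 4GL WRITE-JSON behavior):

```python
import json

# Expected a.json layout when SERIALIZE-NAME is honored: dataset, table
# and field keys come from the serialize-names, not the 4GL names.
expected = {"dataset1": {"table1": [{"field1": 10}]}}
text = json.dumps(expected)

# READ-JSON must also look the names up by serialize-name; a reader that
# searches for the 4GL names (ds1/tt1/f1) will not find these keys.
doc = json.loads(text)
assert "dataset1" in doc and "ds1" not in doc
assert doc["dataset1"]["table1"][0]["field1"] == 10
```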
#3 Updated by Ovidiu Maxiniuc almost 2 years ago
- Status changed from New to WIP
#4 Updated by Ovidiu Maxiniuc almost 2 years ago
- Status changed from WIP to Review
- % Done changed from 0 to 100
Fixed reading a dataset/table with explicit serialize-name attribute.
Committed revision 13877.
#5 Updated by Constantin Asofiei almost 2 years ago
Ovidiu, the JSON keys and XML elements AFAIK are case-sensitive. In TempTableSchema.columns, the names are lower-cased. This doesn't seem right. Please do some tests with 2 fields having Field1 and field1 as serialize-name and xml-node-name.
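The risk with lower-casing the column names can be sketched as follows (hypothetical names, not the actual FWD lookup code):

```python
# Two fields whose serialize-names differ only by case.
serialize_names = ["Field1", "field1"]

# A case-sensitive index keeps both entries; a lower-cased index
# silently collapses them into one, shadowing "Field1".
case_sensitive = {name: name for name in serialize_names}
lower_cased = {name.lower(): name for name in serialize_names}

assert len(case_sensitive) == 2  # both fields remain resolvable
assert len(lower_cased) == 1     # "Field1" is lost behind "field1"
```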
#6 Updated by Ovidiu Maxiniuc almost 2 years ago
Actually, they are not case-sensitive, at least not when they are read. But when they are serialized, the exact serialize-name is used. So, for example, if we have the above temp-table tt1 and then another tt2 with serialize-name "Table1", they will appear as such in the JSON file. However, when reading, all the records will be stored in tt1 (assuming they have the same structure, so no other errors are encountered)!
More than that, name validation for fields, tables and datasets is performed case-insensitively.
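The write/read asymmetry described above can be sketched like this (illustrative Python only, not the FWD implementation):

```python
# Writing uses the exact serialize-name, so tt1 ("table1") and
# tt2 ("Table1") appear as two distinct keys in the JSON file.
written = {"table1": [{"field1": 10}], "Table1": [{"field1": 20}]}

# A case-insensitive reader folds the keys, so both record sets
# resolve to the same target table (tt1 here).
by_table = {}
for key, records in written.items():
    by_table.setdefault(key.lower(), []).extend(records)

assert list(by_table) == ["table1"]
assert len(by_table["table1"]) == 2  # all records landed in tt1
```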
Committed to 6129a as revision 13881.
#7 Updated by Constantin Asofiei almost 2 years ago
- Status changed from Review to Test