Also added a standalone formatter test, for confidence.
Have validated that undoing the change in 835fb947 breaks the tests
(i.e. we are still testing that the change is required).
(And likewise ignore the prefix in unpack.)
Fixes issue #1459.
JSON tests fail, as we're not using OriginalNameAttribute yet.
Introduce ICustomDiagnosticMessage to allow for custom string formatting
This effectively fixes issue #933.
Ensure that
  "valueField": null
is parsed appropriately, i.e. that the parser remembers that the field is set.
|
| | |
|
| | |
|
| | |
|
| | |
|
| | |
|
| | |
|
| | |
|
| | |
|
|/
|
|
|
| |
- Spot an Any without a type URL
- In the conformance test runner, catch exceptions due to generally-invalid JSON
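The first bullet — spotting an Any without a type URL — amounts to an upfront check before formatting. A hedged sketch (hypothetical helper, not the actual C# code):

```python
def ensure_type_url(any_fields):
    """Fail fast if an Any-like mapping carries no "@type" entry."""
    type_url = any_fields.get("@type")
    if not type_url:
        raise ValueError("Any message has no type URL")
    return type_url

assert ensure_type_url({"@type": "type.googleapis.com/Foo"}) \
    == "type.googleapis.com/Foo"
try:
    ensure_type_url({})
    raise AssertionError("should have rejected an Any with no type URL")
except ValueError:
    pass
```

Catching this early yields a clean error instead of emitting invalid output.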
Ensure all formatted well-known-type values are valid JSON
This involves quoting timestamp/duration/field-mask values, even when they're not in fields.
It's better for consistency.
Fixes issue #1097.
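The motivation for quoting is easy to demonstrate: a bare duration or timestamp rendering is not a valid standalone JSON document, while the quoted form is. An illustrative Python check (the sample value is hypothetical):

```python
import json

bare = "1s"      # a protobuf Duration rendered without quotes
quoted = '"1s"'  # the same value quoted, as the commit mandates

def is_valid_json(text):
    """Return True iff the whole text parses as a JSON document."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

assert not is_valid_json(bare)
assert is_valid_json(quoted)
```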
- Tighten up Infinity/NaN handling with respect to whitespace (and test casing)
- Validate that values are genuinely integers when they've been parsed from a JSON number (ignoring the fact that 1.0000000000000000001 == 1 as a double...)
- Allow exponents and decimal points in string representations
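The middle rule — a JSON number bound for an integer field must be a whole number, even though exponents and decimal points are accepted in the text — can be sketched like this (hypothetical helper in Python, not the C# parser; note the double-precision caveat from the commit applies here too):

```python
def parse_int_value(text):
    """Parse a JSON number as an integer, rejecting fractional values."""
    value = float(text)  # accepts "1e3", "5.000", etc.
    # Double precision means e.g. 1.0000000000000000001 slips through as 1.
    if value != int(value):
        raise ValueError(f"{text!r} is not a whole number")
    return int(value)

assert parse_int_value("1e3") == 1000
assert parse_int_value("5.000") == 5
try:
    parse_int_value("1.5")
    raise AssertionError("should have rejected 1.5")
except ValueError:
    pass
```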
This required a rework of the tokenizer to support a "replaying" tokenizer, needed in case the @type value comes after the data itself. The rework is nice in some ways (all the push-back and object-depth logic lives in one place) but is a little fragile in terms of token push-back when using the replay tokenizer. It'll be fine for the scenario we need it for, but we should be careful...
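The core idea of the replaying tokenizer can be sketched in a few lines (Python, illustrative only; the real C# JsonTokenizer works over a reader, not a token list): record every token the first pass emits, then hand back a second tokenizer over that recording so the parser can re-read the object once the @type is known.

```python
class Tokenizer:
    """Toy tokenizer that records emitted tokens for later replay."""

    def __init__(self, tokens):
        self._tokens = list(tokens)
        self._pos = 0
        self._buffer = []  # everything handed out so far

    def next(self):
        token = self._tokens[self._pos]
        self._pos += 1
        self._buffer.append(token)
        return token

    def replay(self):
        """Return a fresh tokenizer over the tokens already consumed."""
        return Tokenizer(self._buffer)

first = Tokenizer(["{", '"data"', ":", "1", "}"])
seen = [first.next() for _ in range(5)]
replayer = first.replay()
again = [replayer.next() for _ in range(5)]
assert seen == again  # the second pass sees the identical stream
```

The fragility the commit mentions is visible even here: pushing a token back onto `first` after `replay()` has been taken would not be reflected in the replaying copy.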
Add recursion limit handling to JSON parsing.
Fixes issue #932.
This is only thrown directly by JsonTokenizer, but surfaces from JsonParser as well. I've added doc comments to hopefully make everything clear.
The exception is actually thrown by the reader within JsonTokenizer, in anticipation of keeping track of the location within the document, but that change is not within this PR.
This includes all the well-known types except Any.
Some aspects are likely to require further work once the JSON parsing expectations are hammered out in more detail. Some of these have "ignored" tests already.
Note that the choice *not* to use Json.NET was made for two reasons:
- Going from 0 dependencies to 1 dependency is a big hit, and there's not much benefit here
- Json.NET parses more leniently than we'd want; accommodating that would be nearly as much work as writing the tokenizer
This really only affects the JsonTokenizer, which could be replaced by Json.NET. The JsonParser code would be about the same length with Json.NET... but I wouldn't be as confident in it.