Recent security research shows that seemingly harmless behaviors in JSON parsers can combine into an assortment of security vulnerabilities arising from how different parsers interpret the same data.
The researchers report that among the 49 JSON parsers they examined, every language surveyed had at least one parser that exhibited a “risky interoperability behavior”.
Attack scenarios against these parsers have been demonstrated, with accompanying Docker Compose labs, exploiting inconsistent duplicate-key precedence, key collisions through character truncation and comments, JSON serialization quirks, float and integer representation differences, and permissive parsing.
Taken in isolation, a single parser exhibiting such behavior is seemingly harmless, aside from the occasional segmentation fault, and so does not qualify as a typical vulnerability. When multiple parsers are combined in one system, however, their discrepancies can lead to severe vulnerabilities: business-logic abuse, injection, and type juggling, among other issues.
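Duplicate-key precedence is the simplest of these discrepancies to illustrate. The following is a minimal Python sketch of the idea; the `"qty"` order field is a hypothetical example, not taken from the research:

```python
import json

# A hypothetical order document with a duplicate "qty" key.
# RFC 8259 says object names "SHOULD be unique" but does not say
# how a parser must resolve duplicates, so behavior varies by
# implementation.
doc = '{"qty": 1, "qty": -1}'

# Python's json module keeps the LAST occurrence:
print(json.loads(doc))  # {'qty': -1}

# A parser that keeps the FIRST occurrence would see qty == 1.
# If a validation service and a downstream service disagree on
# which value wins, an attacker can smuggle the unvalidated value
# past business-logic checks.
```

The vulnerability lives in the disagreement: each parser's choice is defensible on its own, but two services making opposite choices on the same document create an exploitable gap.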
Explaining why developers switch from standard-library parsers to third-party ones, the researchers note that the standard parsers, though more compliant, are often slower.
This is especially concerning for microservice architectures that depend on fast serialization and deserialization.
Interoperability of JSON parsers:
In today’s polyglot microservice architectures, growing interoperability complexity combined with an ambiguous specification leads to erratic behavior across deployments.
Applications built on such architectures routinely come to depend on the individual quirks of each JSON parser in the chain.
JSON parsers and undefined behavior:
Even in the best case, parsers inevitably exhibit small, unintentional quirks.
This might suggest that the JSON specification contradicts itself, but that is not the argument here.
Rather, the official JSON RFC deliberately leaves some topics open-ended, such as how to handle duplicate keys and how to represent numbers.
Although most users of JSON parsers remain unaware of these caveats, the specification does carry explicit disclaimers about interoperability.
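Number representation is a concrete case of this open-endedness. A minimal Python sketch, using made-up literals, shows how the same number can come out differently depending on whether a parser uses arbitrary-precision integers or IEEE-754 doubles:

```python
import json
import math

# RFC 8259 sets no limits on number range or precision, so parsers
# may legitimately represent the same literal differently.

# Python parses integer literals with arbitrary precision:
big = json.loads("10000000000000000000000000001")
print(big)  # exact value, trailing 1 preserved

# A parser that stores every number as an IEEE-754 double (as
# JavaScript engines do) silently rounds the same literal:
print(float(big))  # 1e+28 -- the trailing 1 is gone

# Oversized exponents overflow to infinity on Python's float path:
print(math.isinf(json.loads("1e400")))  # True
```

Two services comparing that value for equality, one with the exact integer and one with the rounded double, will disagree about whether the same document contains the same number.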
Given that, constraining this undefined behavior to deterministic outcomes would improve interoperability and make vulnerabilities easier to detect and fix.
As microservice architectures keep growing in complexity, that trade-off may be worth considering, even though defining previously undefined behavior is a breaking change that could disrupt existing deployments.
Reducing the security threats:
To reduce the security risks, the researchers recommend that parser maintainers raise fatal errors on duplicate keys and replace invalid Unicode with placeholder characters rather than silently truncating, among other measures.
These vulnerabilities can be hard to spot internally because the attacks are subtle. Code reviewers should therefore look for JSON parsers with known risky characteristics, check how duplicate keys are handled, and apply the mitigations suggested in the labs’ README.
JSON parsers as underestimated threats:
Users of the JSON format tend to take its simplicity for granted, and JSON parsers are rarely considered a security threat.
It is only when those assumptions break down, due to inconsistent parser behavior, that business-logic validation starts to fail.
Threats like these highlight the value of research at the most basic levels of the stack, where flaws can have consequential and severe implications.