Is there such a thing as a "payload coefficient" of XML data, or something like that?
What I mean by this: before parsing (i.e. during transmission), XML or JSON is still just a string that carries attributes and their values. The efficiency would be something like the ratio of the length (size in bytes) of the actual data to the length/size of the entire document.
Looking at the file exchange produced by some "mega-advanced" CMS, your eyes start to bleed in terms of this kind of efficiency.
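To make the question concrete, here is a minimal sketch of such a metric: the share of "useful" bytes (attribute values and element text) in the total document size. The function name `payload_ratio` and the choice of what counts as "useful" are my own assumptions, not an established definition.

```python
import xml.etree.ElementTree as ET

def payload_ratio(xml_text: str) -> float:
    # Assumption: "payload" = attribute values + stripped element text;
    # everything else (tag names, attribute names, quotes, brackets)
    # counts as markup overhead.
    root = ET.fromstring(xml_text)
    payload = 0
    for elem in root.iter():
        payload += sum(len(v.encode()) for v in elem.attrib.values())
        if elem.text:
            payload += len(elem.text.strip().encode())
    return payload / len(xml_text.encode())

doc = '<items><item id="1" name="foo"/><item id="2" name="bar"/></items>'
print(f"{payload_ratio(doc):.2f}")
```

In this toy document only 8 of 65 bytes are attribute values, so the ratio comes out quite low; the longer the tag and attribute names, the worse it gets.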
It all depends on who arranges the data and how. Since a JSON/XML developer invents the attribute names themselves, the proportions can be on a completely different scale from one document to another. These formats were created for humans to work with, unlike DBF and the like.
In addition, you can take the XML and compress it, as Microsoft did with the docx and xlsx formats. (But that's off topic.)
With respect to such proportions, a clearer case is URL encoding, which converts binary data into ASCII: there is an obvious size increase of 30 percent or more. But that comparison with JSON/XML is not a very good fit.