
Encoding breaks when writing large documents #16

Open
Tomas-Kraus opened this issue Aug 21, 2008 · 4 comments

Comments

@Tomas-Kraus
Member

The FastInfoset implementation fails when writing documents with many nodes. The error is:

Exception in thread "main" javax.xml.stream.XMLStreamException: java.io.IOException: Integer > 1,048,576
    at com.sun.xml.fastinfoset.stax.StAXDocumentSerializer.encodeTerminationAndCurrentElement(StAXDocumentSerializer.java:630)
    at com.sun.xml.fastinfoset.stax.StAXDocumentSerializer.writeEndElement(StAXDocumentSerializer.java:271)

The same occurs with SAX.

This apparently happens because the indexing tables are filled without bound, past the 1,048,576-entry limit that the specification allows.

The simple but suboptimal fix is to stop inserting into the indexing tables once they reach that limit.
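The proposed capped-insertion fix can be sketched generically. This is an illustrative class, not the actual FastInfoset vocabulary-table implementation; the class and method names are hypothetical. The idea is that once the table is full, lookups of already-indexed values still succeed, but new values are no longer indexed and must be encoded as literals:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of an indexing table capped at the Fast Infoset
// limit of 2^20 (1,048,576) entries. Not the real FastInfoset code.
public class CappedIndexTable {
    static final int MAX_ENTRIES = 1 << 20; // 1,048,576

    private final Map<String, Integer> index = new HashMap<>();

    /**
     * Returns the index of an already-seen value, or assigns the next
     * index if the table is not yet full. Returns -1 when the table is
     * full and the value is new, signalling the caller to encode the
     * value as a literal instead of an index reference.
     */
    public int getOrAdd(String value) {
        Integer existing = index.get(value);
        if (existing != null) {
            return existing;
        }
        if (index.size() >= MAX_ENTRIES) {
            return -1; // table full: stop inserting, fall back to literal
        }
        int next = index.size();
        index.put(value, next);
        return next;
    }

    public static void main(String[] args) {
        CappedIndexTable table = new CappedIndexTable();
        System.out.println(table.getOrAdd("a")); // new value, gets index 0
        System.out.println(table.getOrAdd("b")); // new value, gets index 1
        System.out.println(table.getOrAdd("a")); // already indexed, reuses 0
    }
}
```

With this guard in place the serializer never attempts to emit an index above the spec limit, so the `Integer > 1,048,576` exception cannot occur; the cost is that documents with more than 2^20 distinct values lose compression on the overflow, which is why the issue calls the fix suboptimal.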

Environment

Operating System: All
Platform: All

Affected Versions

[current]

@Tomas-Kraus
Member Author

@glassfishrobot Commented
Reported by rragno

@Tomas-Kraus
Member Author

@glassfishrobot Commented
Was assigned to oleksiys

@Tomas-Kraus
Member Author

@glassfishrobot Commented
This issue was imported from java.net JIRA FI-16
