Incorrect (?) parsing of content in `lstlisting` environment #90
Comments
- Digging a little bit into the source, it appears that there is currently no special handling for the […]. Edit: A quick change which causes […]
- I am not sure if this is the correct way to implement this. It does, however, now lead to the correct behavior and has the nice side effect that […]. Note: This does not handle other macros or environments provided by the […]
- Thanks for the careful analysis! I'll have a closer look at some point soon.
- An update here — the […]
Hello, I have thought a little bit more about the issue I described in #89. While I definitely understand and respect your reasoning, I have come to the conclusion that the issue is not related to `nodelist_to_latex()` at all.

Consider an example where an `lstlisting` environment contains an unmatched math delimiter. It is important to note the presence of `tolerant_parsing=False`: executing such code leads to a parse error. If `tolerant_parsing` is set to `True`, the script completes successfully, and the resulting output shows that a `LatexMathNode` is created inside the `lstlisting` environment.

In my opinion, this is not correct behavior, since content inside the `lstlisting` environment should be treated as verbatim (one big `LatexCharsNode`(?)). The construction of the unbalanced `LatexMathNode` is what eventually causes `nodelist_to_latex()` to produce invalid LaTeX code, but it is, in my opinion, not the root of the problem. Similarly, in the second example in #89, the sequence `\"` gets turned into a `LatexMacroNode`.

Substituting a `verbatim` environment for the `lstlisting` environment instead leads to correct results.

Again, thank you very much for your time!