Tags: class, code, database, include, jar, java, jdbc, mysql, nls_charset12, oracle, oracle9i, path, running, sql, talks
When running Java code that talks to an Oracle9i database, why would I want to include NLS_CHARSET12.jar in my classpath? On the Oracle JDBC download page, NLS_CHARSET12.jar is one of the jars offered for download alongside classes12.jar. I tried removing NLS_CHARSET12.jar from my classpath and was still able to execute a query and retrieve data from the database. In what cases are the classes in this jar invoked? I am trying to understand whether I can remove this jar from the classpath completely.
thank you very much
- 10 Comments
- The JDBC Developer's Guide discusses the particular cases where you need this extra ZIP file:
Basically, whenever you have CHAR or VARCHAR2 columns in collection or object data retrieved from the database that is not ASCII, UTF-8, ISO-LATIN-1, or WE8DEC encoded, you need to include the additional file.
Distributed Database Consulting, Inc.#1; Fri, 22 Feb 2008 03:00:00 GMT
- Does this apply not only to the database side, but also to situations where the "app host" (client or mid-tier) character set is different?
E.g. NLS_LANG set to .WE8MSWIN1252, which is common on Win PCs, I guess.#2; Fri, 22 Feb 2008 03:01:00 GMT
- I assume that these are needed only when the client character set is something other than one of the listed character sets, regardless of the database character set. In other words, if your NLS_LANG is UTF-8, I believe you could avoid using these ZIP files.
The JDBC Programmer's Reference indicates that the JDBC driver does some slightly non-standard things with the NLS_LANG settings, since it always wants to use UTF-16 internally. If you specify an NLS_LANG other than US7ASCII or WE8ISO8859P1, the JDBC driver imposes UTF-8 as the client character set.
From that, my assumption is that the driver needs the additional ZIP files so that it can convert between UTF-8, which it sends over the wire, and whatever other client character set is present.
I haven't done the legwork necessary to verify that my assumptions and interpretations of this bit of documentation are actually correct, but it's the only interpretation I can come up with that seems to make sense.
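As a plain-JDK illustration of that kind of two-step conversion (this is a sketch of the general idea, not the driver's actual internals), data carried as UTF-8 bytes can be decoded and then re-encoded into a Windows-1252 client character set:

```java
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

public class CharsetHop {
    public static void main(String[] args) {
        // A string with non-ASCII content: an accented letter and the euro sign.
        String value = "caf\u00e9 \u20ac";

        // "Over the wire" form: UTF-8 bytes, then decoded back to a String.
        byte[] wire = value.getBytes(StandardCharsets.UTF_8);
        String decoded = new String(wire, StandardCharsets.UTF_8);

        // Client-side re-encoding into the Windows-1252 character set.
        Charset cp1252 = Charset.forName("windows-1252");
        byte[] clientBytes = decoded.getBytes(cp1252);
        String roundTrip = new String(clientBytes, cp1252);

        // Both characters exist in Windows-1252, so the round trip is lossless.
        System.out.println(value.equals(roundTrip)); // prints "true"
    }
}
```

If the client character set could not represent a character (for example, the euro sign in plain ISO-8859-1), the re-encoding step is where data would be substituted or lost, which is presumably why a conversion library has to know both character sets.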
Distributed Database Consulting, Inc.#3; Fri, 22 Feb 2008 03:02:00 GMT
The explanation that your link points to talks about the OCI driver and says that this driver uses NLS_CHARSET12.jar to do the character conversion. I use the Oracle thin driver, and I don't see an explanation of how and when the thin driver uses this jar. Is it used the same way by both drivers?#4; Fri, 22 Feb 2008 03:03:00 GMT
- Never mind Justin, my bad. I see the explanation for the thin driver as well#5; Fri, 22 Feb 2008 03:04:00 GMT
How do I check what character set is currently used by the db?#6; Fri, 22 Feb 2008 03:05:00 GMT
- A database will have two different character sets. The NLS_CHARACTERSET parameter specifies the character set for CHAR, VARCHAR2, and CLOB columns. The NLS_NCHAR_CHARACTERSET parameter specifies the character set for NCHAR, NVARCHAR2, and NCLOB columns. You can check both with:
SELECT parameter, value
FROM nls_database_parameters
WHERE parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET')
Distributed Database Consulting, Inc.#7; Fri, 22 Feb 2008 03:06:00 GMT
I ran the query and for NLS_CHARACTERSET I got WE8MSWIN1252. The reason I wanted to know the charset is to see if it's one of the four charsets that you listed earlier as supported by the classes in classes12.jar. According to the link you provided, if it's not one of those four, then I would need to include NLS_CHARSET12.jar for the appropriate conversion to take place. However, even when I don't include NLS_CHARSET12.jar, the VARCHARs and CHARs are still retrieved correctly. Can you advise?
Am I not looking at the right property setting?#8; Fri, 22 Feb 2008 03:07:00 GMT
- My interpretation of the documentation, which is a bit unclear here, leads me to believe that the determining factor in whether the NLS_CHARSET12.jar file is used is the character set on the client machine, rather than the character set on the server. The client's character set can be set in the registry or as an environment variable (NLS_LANG). Otherwise, it will default to the server's value.
The Windows-1252 and ISO-8859-1 (ISO-Latin-1) character sets are very, very similar. They may well be identical, I'm not sure. I would wager that you'd only have problems dealing with characters that differ between the two encodings. It may be that no characters differ, in which case you'd be able to omit the JAR file, unless and until the client's NLS_LANG was changed to something other than one of the "acceptable" character sets.
Distributed Database Consulting, Inc.#9; Fri, 22 Feb 2008 03:08:00 GMT
- Yes, the doc is not very clear on this.
Nearly all JDBC character set conversions for CHAR, VARCHAR, NCHAR, etc. happen on the server side (i.e., using the NLSRTL engine on the database side), so typically no client-side character set library (NLS_CHARSET12.jar) is needed.
Client-side character set conversions are required for Oracle OBJECT and COLLECTION types only. And if your db character set and the client character set are each one of UTF8, WE8DEC, US7ASCII, or ISO-Latin-1, then NLS_CHARSET12.jar is not needed either, because those conversions are already supported by classes12.jar.
The only time you need NLS_CHARSET12.jar is when you are using OBJECT or COLLECTION types, and your db and/or client character set is not one of the four previously listed character sets.
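That rule can be condensed into a small helper. This is a hypothetical sketch (the method and class names are my own, not an Oracle API, and BUILT_IN holds my reading of the four charsets in their Oracle identifiers):

```java
import java.util.Set;

public class NlsJarCheck {
    // The four character sets handled by classes12.jar on its own,
    // per the discussion above (Oracle identifiers are my assumption).
    static final Set<String> BUILT_IN =
            Set.of("US7ASCII", "UTF8", "WE8ISO8859P1", "WE8DEC");

    // NLS_CHARSET12.jar matters only for OBJECT/COLLECTION columns when
    // either character set falls outside the built-in four; plain
    // CHAR/VARCHAR2 conversion happens server-side regardless.
    static boolean needsNlsCharsetJar(String dbCharset, String clientCharset,
                                      boolean usesObjectsOrCollections) {
        if (!usesObjectsOrCollections) {
            return false;
        }
        return !BUILT_IN.contains(dbCharset) || !BUILT_IN.contains(clientCharset);
    }

    public static void main(String[] args) {
        // Plain query against a WE8MSWIN1252 db: jar not needed.
        System.out.println(needsNlsCharsetJar("WE8MSWIN1252", "WE8MSWIN1252", false)); // false
        // Same db, but fetching OBJECT/COLLECTION columns: jar needed.
        System.out.println(needsNlsCharsetJar("WE8MSWIN1252", "WE8MSWIN1252", true));  // true
        // Objects, but both charsets are in the built-in four: jar not needed.
        System.out.println(needsNlsCharsetJar("UTF8", "US7ASCII", true));              // false
    }
}
```

This also explains the original poster's observation: a plain SELECT of VARCHAR2 columns works fine without the jar even against a WE8MSWIN1252 database, because no OBJECT or COLLECTION types were involved.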
Nat#10; Fri, 22 Feb 2008 03:09:00 GMT