J2V8 Supports 16-bit Characters

July 26, 2016 | 1 min Read

When J2V8 was first released, all Java characters passed between Java and V8 were converted to 8-bit C-style strings. For many applications this was just fine, but if your JavaScript contained 16-bit Unicode characters, then you were hosed.

https://twitter.com/waynebeaton/status/757609271496286208

With J2V8 4, this has been fixed. All strings are now handled as uint16_t arrays in C++. Using the JNI API, we fetch the two-byte string with:

const uint16_t* unicodeString = env->GetStringChars(string, NULL);

and create the V8 String object using String::NewFromTwoByte().

Now you can execute JavaScript, or access JS properties, that contain 16-bit characters such as emoji and accented text.

For more J2V8 Tips and Tricks, follow me on Twitter.

Ian Bull

Ian is an Eclipse committer and EclipseSource Distinguished Engineer with a passion for developer productivity.

He leads the J2V8 project and has served on several …