
How to Resolve ORA-12899: value too large for column

ORA-12899

ORA-12899 means that the value being inserted or updated is larger than the maximum length defined for the column in the table.

What can you do about the larger values? Make the column wider? Yes, that's the correct answer. The formal solution to ORA-12899 is to make the column wider so that it can accommodate potentially larger values.
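Before getting into the special case, here is a minimal sketch of the error and the standard fix. The table T and column NAME are hypothetical; any character column that is narrower than the incoming value behaves the same way.

SQL> CREATE TABLE t (name VARCHAR2(5));
SQL> INSERT INTO t VALUES ('a longer value');
-- fails with ORA-12899: value too large for column ... "NAME" (actual: 14, maximum: 5)

SQL> ALTER TABLE t MODIFY (name VARCHAR2(30));
SQL> INSERT INTO t VALUES ('a longer value');
-- succeeds after widening the column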

In this post, I'll talk about a special case involving NLS_LENGTH_SEMANTICS. Let's see how I ran into ORA-12899 when I migrated some tables from one database (UTF8) to another (AL32UTF8).
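The tables were copied over a database link named ORA12C, created in the source database and pointing to the target. A minimal sketch of such a link, with an assumed user and TNS alias, looks like this:

SQL> CREATE DATABASE LINK ORA12C CONNECT TO erpapp IDENTIFIED BY "your_password" USING 'ORA12C';

With the link in place, the migration itself is a plain INSERT ... SELECT, and that is where the error showed up.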

SQL> INSERT INTO "ERPAPP"."FIN_NORTH_HIST"@ORA12C SELECT * FROM "ERPAPP"."FIN_NORTH_HIST";
INSERT INTO "ERPAPP"."FIN_NORTH_HIST"@ORA12C SELECT * FROM "ERPAPP"."FIN_NORTH_HIST"
*
ERROR at line 1:
ORA-12899: value too large for column "ERPAPP"."FIN_NORTH_HIST"."MAT"
(actual: 3, maximum: 1)
ORA-02063: preceding line from ORA12C

Check Column Definition

Let's see the definition of said column in the target database.

SQL> column data_type format a10;
SQL> select data_type, char_length, char_used from dba_tab_cols where owner = 'ERPAPP' and table_name = 'FIN_NORTH_HIST' and column_name = 'MAT';

DATA_TYPE  CHAR_LENGTH C
---------- ----------- -
CHAR                 1 B

The value B in the CHAR_USED column of DBA_TAB_COLS means BYTE. That is to say, this column uses BYTE length semantics. By the way, you should know the difference between BYTE and CHAR in NLS_LENGTH_SEMANTICS before reading further.
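If the difference is new to you, here is a small sketch you can try in an AL32UTF8 database; the table name and sample data are purely illustrative.

SQL> CREATE TABLE semantics_demo (c_byte CHAR(1 BYTE), c_char CHAR(1 CHAR));

-- The Euro sign is a single character, but it takes 3 bytes in UTF-8.
SQL> INSERT INTO semantics_demo (c_char) VALUES ('€');   -- succeeds
SQL> INSERT INTO semantics_demo (c_byte) VALUES ('€');   -- fails with ORA-12899 (actual: 3, maximum: 1)

-- LENGTH counts characters, LENGTHB counts bytes.
SQL> SELECT LENGTH('€') chars, LENGTHB('€') bytes FROM dual;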

Solution to ORA-12899

Since BYTE semantics with the same declared length is not safe for this kind of migration, I changed the semantics of this column from BYTE to CHAR, which is the safer choice for databases with a multi-byte character set. In UTF-8, a single character can take 1 to 4 bytes of storage.

SQL> alter table "ERPAPP"."FIN_NORTH_HIST" modify(MAT char(1 char));

Table altered.

Or you can just increase the column width and stay with BYTE.

SQL> alter table "ERPAPP"."FIN_NORTH_HIST" modify(MAT char(4 byte));

Either way should work.
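In practice, MAT is rarely the only column defined with BYTE semantics. A sketch like the following, which assumes you want to convert every CHAR and VARCHAR2 column of the ERPAPP schema, generates the remaining ALTER statements from DBA_TAB_COLS:

SELECT 'ALTER TABLE "' || owner || '"."' || table_name ||
       '" MODIFY ("' || column_name || '" ' || data_type ||
       '(' || char_length || ' CHAR));' AS fix_ddl
  FROM dba_tab_cols
 WHERE owner = 'ERPAPP'
   AND data_type IN ('CHAR', 'VARCHAR2')
   AND char_used = 'B';

Review the generated statements before running them, especially for very wide columns.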

Check Column Definition Again

Let's check the definition of this column again.

SQL> select data_type, char_length, char_used from dba_tab_cols where owner = 'ERPAPP' and table_name = 'FIN_NORTH_HIST' and column_name = 'MAT';

DATA_TYPE  CHAR_LENGTH C
---------- ----------- -
CHAR                 1 C

OK! Now the semantics of this column is CHAR. Let's run the migration again.

SQL> INSERT INTO "ERPAPP"."FIN_NORTH_HIST"@ORA12C SELECT * FROM "ERPAPP"."FIN_NORTH_HIST";

1468224039 rows created.

SQL> COMMIT;

Commit complete.

Good! No ORA-12899.
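As an optional sanity check, you can compare the row counts on both sides of the link; they should match.

SQL> SELECT COUNT(*) FROM "ERPAPP"."FIN_NORTH_HIST";
SQL> SELECT COUNT(*) FROM "ERPAPP"."FIN_NORTH_HIST"@ORA12C;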

There's still a question in my mind: why did this happen at all? Since AL32UTF8 is in fact a superset of UTF8, column width should not have been a problem during the data migration. I suspect that the column data was encoded in a non-UTF8-based character set in the first place.
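If you want to confirm the character sets involved, you can query NLS_DATABASE_PARAMETERS both locally and over the database link:

SQL> SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';
SQL> SELECT value FROM nls_database_parameters@ORA12C WHERE parameter = 'NLS_CHARACTERSET';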
