Vikunja itself is always fully capable of handling utf-8 characters. However, your database might not be. Vikunja will work just fine until you want to use non-latin characters in your tasks/projects/etc.
On this page, you will find information about how to ensure that non-latin characters like aüäß or emojis fully work with your installation.
PostgreSQL & SQLite
PostgreSQL and SQLite should handle utf-8 just fine. If you nevertheless discover any issues, please drop us a message.
MySQL
MySQL is not able to handle all of utf-8 by default: older versions create new databases with the latin1 charset, and even MySQL's legacy utf8 charset only stores up to three bytes per character, so 4-byte characters such as emojis break. To fix this, convert the database to utf8mb4 by following the steps below.
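If you are setting up a fresh database rather than converting an existing one, you can avoid the problem entirely by creating it with the right charset from the start. A minimal example, assuming you want to call the database vikunja:
CREATE DATABASE vikunja CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;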
To find out if your db supports utf-8, run the following in a MySQL shell or similar, assuming the database you're using for Vikunja is called vikunja:
SELECT default_character_set_name FROM information_schema.SCHEMATA WHERE schema_name = 'vikunja';
This will get you a result like the following:
+----------------------------+
| default_character_set_name |
+----------------------------+
| latin1 |
+----------------------------+
1 row in set (0.001 sec)
A charset of latin1 means the db is encoded in latin1, which cannot store characters outside the Latin-1 range, such as emojis.
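You can also check the encoding of each individual table; the table_collation column implies the table's charset. Again assuming the database is called vikunja:
SELECT table_name, table_collation FROM information_schema.TABLES WHERE table_schema = 'vikunja';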
(The following guide is based on this thread from Stack Overflow.)
0. Back up your database
Before attempting any conversion, please back up your database.
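One way to do that, assuming your database is called vikunja and you connect as root:
mysqldump -uroot -p vikunja > vikunja_backup.sql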
1. Create a pre-conversion script
Copy the following SQL statements into a file called preAlterTables.sql and replace all occurrences of vikunja with the name of your database:
use information_schema;
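-- Each SELECT below only prints the ALTER statements needed for the conversion;
-- they are collected into a script and executed in the next steps.
-- Convert the default charset of the database itself: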
SELECT concat("ALTER DATABASE `",table_schema,"` CHARACTER SET = utf8mb4 COLLATE = utf8mb4_unicode_ci;") as _sql
FROM `TABLES` where table_schema like 'vikunja' and TABLE_TYPE='BASE TABLE' group by table_schema;
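-- Convert each table to utf8mb4: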
SELECT concat("ALTER TABLE `",table_schema,"`.`",table_name,"` CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;") as _sql
FROM `TABLES` where table_schema like 'vikunja' and TABLE_TYPE='BASE TABLE' group by table_schema, table_name;
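-- Convert every varchar/char column individually: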
SELECT concat("ALTER TABLE `",table_schema,"`.`",table_name, "` CHANGE `",column_name,"` `",column_name,"` ",data_type,"(",character_maximum_length,") CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci",IF(is_nullable="YES"," NULL"," NOT NULL"),";") as _sql
FROM `COLUMNS` where table_schema like 'vikunja' and data_type in ('varchar','char');
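-- Convert every text-type column individually: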
SELECT concat("ALTER TABLE `",table_schema,"`.`",table_name, "` CHANGE `",column_name,"` `",column_name,"` ",data_type," CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci",IF(is_nullable="YES"," NULL"," NOT NULL"),";") as _sql
FROM `COLUMNS` where table_schema like 'vikunja' and data_type in ('text','tinytext','mediumtext','longtext');
2. Run the pre-conversion script
Running this will create the actual migration script for your particular database structure and save it in a file called alterTables.sql:
mysql -uroot < preAlterTables.sql | egrep '^ALTER' > alterTables.sql
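If your MySQL root user requires a password, add the -p flag to the mysql commands to be prompted for it.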
3. Convert the database
At this point, converting is just a matter of executing the previously generated SQL script:
mysql -uroot < alterTables.sql
4. Verify it was successfully converted
If everything worked as intended, the default charset of your db should now look like this:
SELECT default_character_set_name FROM information_schema.SCHEMATA WHERE schema_name = 'vikunja';
This should get you:
+----------------------------+
| default_character_set_name |
+----------------------------+
| utf8mb4 |
+----------------------------+
1 row in set (0.001 sec)
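To be extra thorough, you can also check that no column was left behind; this query, again assuming a database called vikunja, should return an empty result set:
SELECT table_name, column_name, character_set_name
FROM information_schema.COLUMNS
WHERE table_schema = 'vikunja' AND character_set_name IS NOT NULL AND character_set_name != 'utf8mb4';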