Then, just below, change the default database settings (just in case). Only the ENGINE has been changed; all other settings are loaded from environment variables:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DB_NAME'),
        'USER': os.environ.get('DB_USER'),
        'PASSWORD': os.environ.get('DB_PASSWORD'),
        'HOST': os.environ.get('DB_HOST'),
        'PORT': os.environ.get('DB_PORT'),
        'OPTIONS': DATABASE_OPTIONS,
    }
}
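If you want to check that these environment variables actually reach PostgreSQL before involving Django, a small standalone script can help. This is only an optional sketch; it assumes psycopg2 (the driver Django uses for its postgresql backend) is installed and reuses the same DB_* variables as above:

# Optional sanity check, not part of openIMIS: connect with the same DB_* variables.
import os
import psycopg2

conn = psycopg2.connect(
    dbname=os.environ.get('DB_NAME'),
    user=os.environ.get('DB_USER'),
    password=os.environ.get('DB_PASSWORD'),
    host=os.environ.get('DB_HOST'),
    port=os.environ.get('DB_PORT'),
)
print('Connected to database:', conn.get_dsn_parameters().get('dbname'))
conn.close()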
At this point you could just use the SQL dump of all tables, views, and functions that can be found in the data migration GitHub repo and be done with everything. If, instead, you want Django to create and manage the tables, continue with the guide.
3. Manage (almost) all tables with Django
In every module except api_fhir (openimis-be-api_fhir_py), go through each table and, in the model's Meta subclass, set managed to True and add all the missing columns (which have been commented out). The FHIR API has been excluded because another project is currently updating that module. There are only two exceptions: in the core module, core_User_Groups (the class is named UserGroup) should keep managed set to False, and in the claim module, the table claim_ClaimAttachmentsCountView (the class is named ClaimAttachmentCount and defines a view, not a table) should also keep managed set to False. A sketch of both cases is shown below.
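For orientation, the two situations look roughly like this in a module's models.py. The first class and its table name are purely illustrative; the second reflects the claim module exception described above:

from django.db import models

class SomeOpenImisModel(models.Model):  # illustrative name
    # ... fields, including the previously commented-out columns ...

    class Meta:
        managed = True  # Django now creates and manages this table
        db_table = 'tblSomeTable'  # illustrative table name


class ClaimAttachmentCount(models.Model):
    # ... fields ...

    class Meta:
        managed = False  # this maps to a view, so Django must not try to create it
        db_table = 'claim_ClaimAttachmentsCountView'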
4. Create this table in the new database
In order to get Django to make the migration, the table auth_group must exist in your new database and contain the two groups user and spiderman. I don’t know why, but Django refuses to make the migrations unless this table exists. Just execute the following SQL query in your PostgreSQL database:
-- The id column's default references auth_group_id_seq, so make sure that sequence exists first.
CREATE SEQUENCE IF NOT EXISTS public.auth_group_id_seq;

CREATE TABLE public.auth_group
(
    id integer NOT NULL DEFAULT nextval('auth_group_id_seq'::regclass),
    name character varying(80) COLLATE pg_catalog."default" NOT NULL,
    CONSTRAINT auth_group_pkey PRIMARY KEY (id),
    CONSTRAINT auth_group_name_key UNIQUE (name)
)
WITH (
    OIDS = FALSE
)
TABLESPACE pg_default;

ALTER TABLE public.auth_group
    OWNER to postgres;

CREATE INDEX auth_group_name_a6ea08ec_like
    ON public.auth_group USING btree
    (name COLLATE pg_catalog."default" varchar_pattern_ops ASC NULLS LAST)
    TABLESPACE pg_default;

INSERT INTO public.auth_group (id, name) VALUES (1, 'spiderman');
INSERT INTO public.auth_group (id, name) VALUES (2, 'user');
5. Make the migration
Now make the migration. In your <openIMIS base path>\openimis-be_py\openIMIS\ directory, run the following command from a command line:
python manage.py makemigrations
Depending on your system setup, you may need to run python3 instead of python.
You can keep this command-line window open, as we need to run another command in the same directory later.
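If you want to see what makemigrations produced before applying anything, Django's built-in showmigrations command lists every app's migrations and marks applied ones with [X] and pending ones with [ ]:

python manage.py showmigrations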
6. Delete the table created in step 4
Now, drop the table we created (auth_group) so that Django actually applies the migration. Yes, Django wants this table to exist when making the migrations but needs it to be gone when applying them.
In your PostgreSQL database, run the following query:
DROP TABLE public.auth_group;
-- If you created the helper sequence in step 4, remove it as well so Django can start clean:
DROP SEQUENCE IF EXISTS public.auth_group_id_seq;
7. Adjust Migration Dependency
Django wants to run the migration for the policy module before the product module, which will lead to errors. In your product module's directory (where that module's models.py file is located) there is a migrations folder, and in it a file called 0001_initial.py. In that file, add the following lines at the top of the Migration class, just below initial = True:
run_before = [
    ('policy', '0001_initial'),
]
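In context, the top of the product module's migrations/0001_initial.py then looks roughly like this (the dependencies and operations lists are whatever makemigrations generated for you and are omitted here):

from django.db import migrations


class Migration(migrations.Migration):

    initial = True

    # Make Django apply the product migration before the policy migration.
    run_before = [
        ('policy', '0001_initial'),
    ]

    dependencies = [
        # ... as generated by makemigrations ...
    ]

    operations = [
        # ... as generated by makemigrations ...
    ]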
A further note on this: originally, I had not realised that the commented-out columns were still being used, and I made the initial migration without them (updating the migration later). This step may need adjusting when all columns are present from the start, but thankfully Django is very clear with its error messages, and you should have no trouble finding out which module should run before another. The basic idea remains the same and is easily adapted.
8. Apply the Migration
Now you can apply the migration. Go back to your command-line window from step 5 and run the following command in the same directory:
python manage.py migrate
The basic database structure of the openIMIS database should now exist in your PostgreSQL database.
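If you want to verify this, one optional check is to list the tables Django can now see. This is just a sketch using Django's standard introspection API, run from within python manage.py shell:

from django.db import connection

tables = connection.introspection.table_names()
print(len(tables), 'tables found')
print(sorted(tables))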
9. Add Missing Tables
Not all 84 database tables have Django definitions yet. The 20 missing tables can be added by running an SQL script. The scripts are getting very long now, so I have added them as an attachment:
Attachment: 01_missing_tables.sql
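You can run the attached script with any PostgreSQL client (psql, pgAdmin, and so on). If you prefer to stay inside Django, the following sketch runs it over the already configured default connection; the file name is taken from the attachment above and the script is assumed to be UTF-8:

# Run inside `python manage.py shell`, from the directory containing the script.
from django.db import connection

with open('01_missing_tables.sql', encoding='utf-8') as f, connection.cursor() as cursor:
    cursor.execute(f.read())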
Now that all tables are present, you can already migrate all the data over from a SQL Server database (e.g. the demo database), using the migration tool we created.
10. Views and Functions
Adding the 34 views and 23 functions of openIMIS is as simple as executing the prepared SQL queries. But first, you need to add a few SQL Server system functions that PostgreSQL does not have. I have prepared a file with the functionality of the Year(), Quarter(), Month(), DateDiff(), DateName(), EOMonth(), and DateAdd() functions. Those functions have been used so often that it made sense to port the same functionality to PostgreSQL. Before writing new code that uses these functions, look at the script and notice that the syntax is slightly different (for performance reasons): DateDiff("year", date), for example, is now DateDiff_Year(date). Anyway, here's the script:
Once this is done, you can add the views first and the functions second (some functions read from views, so we have to respect that dependency). Here are the scripts:
If you really want to, you can also add the three stored procedures that I managed to migrate in the time I had available: uspSSRSPolicyStatus, uspPolicyValue, and uspGetPolicyPeriod. Here's the script:
Your database is now up and running.