Calling all Oracle users in Ireland.
Tom Kyte will be back in Dublin on Wednesday 19th September for a half day seminar.
The event is being organised by the Ireland OUG and Oracle.
It will be in the Gibson Hotel beside the Point Village.
This is a FREE event and open to everyone, so share the news and get to see Tom Kyte present for 4 hours.
As they say, spaces are limited, so book your place today. I have.
To register for the event – click here.
In a previous blog post I talked about how you can take a copy of a workflow developed in Oracle Data Miner, and load it into a new schema.
When your data mining project gets to a mature stage and you need to productionalise the data mining process and model updates, you will need to use a different set of tools.
As you gather more and more data and cases, you will be updating/refreshing your models to reflect this new data. The newly updated data mining model then needs to be moved from the development/test environment to the production environment. As with all things in IT, we would like to automate this updating of the model in production.
There are a number of database features and packages that we can use to automate the update, and it involves setting up some scripts on the development/test database and also on the production database.
These steps include:
- Creating a directory on the development/test database
- Exporting the updated Data Mining model
- Copying the exported Data Mining model to the production server
- Removing the existing Data Mining model from production
- Importing the new Data Mining model
- Renaming the imported model to the standard name
The DBMS_DATA_MINING PL/SQL package has two procedures that allow us to export a model and to import a model. These procedures are an API to Oracle Data Pump. The procedure to export a model is DBMS_DATA_MINING.EXPORT_MODEL and the procedure to import a model is DBMS_DATA_MINING.IMPORT_MODEL. The parameters to these procedures are what you would expect to use if you were using Data Pump directly, but they have been tailored for the data mining models.
Let's start by listing the models that we have in our development/test schema:
SQL> connect dmuser2/dmuser2
SQL> SELECT model_name FROM user_mining_models;
Create/define the directory on the server where the models will be exported to.
CREATE OR REPLACE DIRECTORY DataMiningDir AS 'c:\app\Data_Mining_Exports';
The schema you are using will need to have the CREATE ANY DIRECTORY privilege.
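If your schema does not already have this privilege, it can be granted by a DBA. A quick sketch, assuming our development/test schema is the dmuser2 schema used above:

GRANT CREATE ANY DIRECTORY TO dmuser2;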
Now we can export our model. In this example we are going to export the Decision Tree model (CLAS_DT_1_6).
The procedure has the following structure:

DBMS_DATA_MINING.EXPORT_MODEL (
     filename     IN VARCHAR2,
     directory    IN VARCHAR2,
     model_filter IN VARCHAR2 DEFAULT NULL,
     filesize     IN VARCHAR2 DEFAULT NULL,
     operation    IN VARCHAR2 DEFAULT NULL,
     remote_link  IN VARCHAR2 DEFAULT NULL,
     jobname      IN VARCHAR2 DEFAULT NULL);
If we wanted to export all the models into a file called Exported_DM_Models, we would run:
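When model_filter is left at its default of NULL, all the models in the schema are exported, so only the filename and the directory are needed. A minimal sketch:

BEGIN
   -- model_filter defaults to NULL, so every model in the schema is exported
   DBMS_DATA_MINING.EXPORT_MODEL('Exported_DM_Models', 'DataMiningDir');
END;
/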
If we just wanted to export our Decision Tree model to file Exported_CLASS_DT_Model, we would run:
DBMS_DATA_MINING.EXPORT_MODEL('Exported_CLASS_DT_Model', 'DataMiningDir', 'name IN (''CLAS_DT_1_6'')');
Before you can load the newly updated data mining model into your production database, you need to drop the existing model. Before you do this, you need to ensure that the model is not in use, so it would be advisable to schedule the dropping of the model during a quiet time, for example before or after the nightly backups/processes.
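The DBMS_DATA_MINING package provides a DROP_MODEL procedure for this step. A minimal sketch, where CLAS_DT_PROD is an illustrative assumption for the standard model name used in production:

BEGIN
   -- remove the existing production model to make way for the updated one
   DBMS_DATA_MINING.DROP_MODEL('CLAS_DT_PROD');
END;
/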
Warning: When importing the data mining model, you need to import into a tablespace that has the same name as the tablespace in the development/test database. If the USERS tablespace is used in the development/test database, then the model will be imported into the USERS tablespace in the production database.
Hint: Create a DATAMINING tablespace in your development/test and production databases. This tablespace can be used solely for data mining purposes.
To import the decision tree model we exported previously, we would run:
DBMS_DATA_MINING.IMPORT_MODEL('Exported_CLASS_DT_Model', 'DataMiningDir', 'name = ''CLAS_DT_1_6''', 'IMPORT', null, null, 'dmuser2:dmuser3');
We now have the newly updated data mining model loaded into the production database. The last parameter, 'dmuser2:dmuser3', is the schema remap, which takes the model exported from the dmuser2 schema and imports it into the dmuser3 schema.
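To confirm that the import has worked, we can list the models in the production schema, just as we did earlier on development/test:

SQL> SELECT model_name FROM user_mining_models;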
The final step before we can start using the newly updated model in our production database is to rename the imported model to the standard name that is being used in the production database.
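DBMS_DATA_MINING also provides a RENAME_MODEL procedure for this. A minimal sketch, reusing the illustrative CLAS_DT_PROD standard name from the drop step above:

BEGIN
   -- rename the imported model to the name the production processes expect
   DBMS_DATA_MINING.RENAME_MODEL('CLAS_DT_1_6', 'CLAS_DT_PROD');
END;
/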
Scheduling of these steps
We can wrap most of this up into stored procedures and schedule it to run on a regular basis, using the DBMS_JOB package. The following example schedules a procedure that controls the importing, dropping and renaming of the models.
DBMS_JOB.SUBMIT(jobnum, 'import_new_data_mining_model;', TRUNC(SYSDATE), 'ADD_MONTHS(TRUNC(SYSDATE), 1)');
This schedules the running of the procedure that imports the new data mining models, to run immediately and then to run every month.
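Note that the first parameter of DBMS_JOB.SUBMIT is an OUT parameter that receives the new job number, so the call needs to sit inside a PL/SQL block with a variable to capture it. A minimal sketch, assuming the import_new_data_mining_model stored procedure has already been created:

DECLARE
   jobnum BINARY_INTEGER;  -- populated by DBMS_JOB with the new job number
BEGIN
   DBMS_JOB.SUBMIT(jobnum,
                   'import_new_data_mining_model;',    -- the procedure to run
                   TRUNC(SYSDATE),                     -- first run: immediately
                   'ADD_MONTHS(TRUNC(SYSDATE), 1)');   -- then re-run one month later each time
   COMMIT;  -- the job only becomes visible to the job queue once committed
END;
/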
This year the Annual UKOUG event will be in Birmingham (again) from the 5th to 7th December.
This year there is a slight difference to the usual schedule. On Sunday 4th December there is an Oak Table event, with two parallel tracks and all the well-known experts presenting.
If I had the Time-Turner from Harry Potter, I would be able to go to all the sessions.
Presenters include Mogens Nørgaard, Jonathan Lewis, Frits Hoogland, Martin Widlake, Christian Antognini, Connor McDonald, James Morle and Wolfgang Breitling.
This is an impressive line up and hopefully the UKOUG will run a similar event in 2012.
Check out the full agenda at
This is one event that I would love to go to, but unfortunately I won't be able to make it. I'll be attending the Annual UKOUG conference alright, and I have already booked my airline tickets. But there are no flights from Dublin that would get me to Birmingham on time. I would need to fly to Birmingham on the Saturday, involving another hotel night and another night away from the family.
The best I’m hoping for is to get to the ICC in time for the Panel Session on Performance and High Availability.
Depending on weather and travel delays I might even miss this last session. If I do, I can always meet up with everyone in the pub on the Sunday evening for a chat.
Maybe next year.