Monday, October 15, 2012
The past couple of weeks, I have been preparing for the FIS/HR upgrade that is coming in November. Job Submission was not working correctly, and we got it working like it does in production. I need to get together with Soukup before the upgrade to make sure I am aware of everything I will need to do.
Bruce and I talked a bit and found a way to record the distance someone lives from our university using the Google Maps API. It will be interesting to get that measurement and see if it brings useful information. I am not sure whether Google will allow us to hit the API that many times to get the latitude and longitude.
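Once the Geocoding API returns latitude/longitude pairs, the distance itself is just the haversine formula. A minimal sketch (the coordinates below are hypothetical, standing in for the campus and a student's address):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # 3958.8 = Earth's mean radius in miles

# Hypothetical coordinates: campus vs. a Portland-area address
print(round(haversine_miles(44.854, -123.238, 45.515, -122.678), 1))
```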
Wednesday, July 25, 2012
We have Eric's request about the data warehouse set. Bruce is trying to adjust the report in Cognos to show it the way Eric would like. I need to go through and really polish the ETL for the whole data warehouse:
1. Add in some of the transformations to the ETL (Some of them were done on the fly for testing)
2. Redo student to enrollment, so it has the lowest level of detail?
3. Add a mapping in student/enrollment for the one that we deleted. (to not load 2 things in one mapping)
4. Possibly rename some things and have a better process for ETL development
a. When to bind dimensions/cubes
b. When to deploy, just to see if there is a faster way.
5. Build easier to get to data (if student/enrollment is redone)
6. Revisit the demographic for student.
7. Revisit the academics for student.
I have also been setting up some cron jobs on wouupg and getting up to speed with what needs to be done.
Thursday, July 12, 2012
Data Warehouse
This week we have been working on answering Eric's question with the data warehouse. I feel like the planning done beforehand was good, but I can now see where it can be polished a little more:
1) First thing is ALWAYS add time to research and understand how to answer the question in production and how things are linked. When we planned this, we just thought of the work to make the dimensional model.
2) Time to Build and load ETL + time to link in Framework Manager + time to make reports + research time + testing time = total time. Keep in mind while testing, this might mean going back through the whole process (hopefully not, but it happens).
While doing this I was able to squeeze in fixing the leave in production. Banner figures leave a little differently than how we used to figure it, so we are going to match how Banner handles leave rather than the other way around.
Friday, June 1, 2012
Busy week
This week has been busy. I have spent a bit of time trying to get FUPLOAD to a point where Eric and Michele can use it all on their own. There was a problem with DOS-to-Unix formatting, but I was able to solve it with some Visual Basic behind the spreadsheet. Apparently, Microsoft does not like to list out their file format codes, but changing the file format from 'xlText' to 'xlTextMac' solved the problem because Mac formatting is the same as Unix, which makes me happy :)
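The DOS-versus-Unix problem comes down to line endings (CRLF vs. LF). Outside of Excel, the same fix could be sketched like this (the sample bytes are hypothetical):

```python
def dos2unix(data: bytes) -> bytes:
    """Convert DOS/Windows line endings (CRLF) to Unix line endings (LF)."""
    return data.replace(b"\r\n", b"\n")

# A two-line file the way a DOS-formatted export would write it
dos_bytes = b"line one\r\nline two\r\n"
print(dos2unix(dos_bytes))
```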
I am starting on more of the data warehouse. I am planning the ETL and smoothing out what each dimension should look like. It seems all the FOAPAL elements are hammered out; I just need to figure out how to classify an entity as a vendor, student, staff, faculty, and so on. Currently, I am thinking of grabbing pidms, looking them up to see what they are, and placing them accordingly, so I think I will make a function that takes a pidm and inserts into a temp table.
Thursday, February 23, 2012
Restoring table to a certain time
I recently made a mistake while doing a non-standard update in Banner and had to call Jim Rouff to restore the table to a previous point in time. I have been reading some of a book that Christina has and read about FLASHBACK. Now, if I make that same mistake, I can resolve it myself. Example:
FLASHBACK TABLE <table name> to timestamp systimestamp - interval '0 00:02:00' day to second;
This will restore the table to what it was 2 minutes ago; the interval is written as days, then hours:minutes:seconds.
You may get an error/warning that row movement is not enabled; if so, do this:
ALTER TABLE <table name> ENABLE ROW MOVEMENT;
then after the flashback you can disable row movement again with:
ALTER TABLE <table name> DISABLE ROW MOVEMENT;
No need to call Jim again :)
NOTE: If the structure of the table was changed (like a dropped column), then the FLASHBACK won't work. But if you drop a table, you can use:
FLASHBACK TABLE <table name> TO BEFORE DROP;
A dropped table is restored from the Oracle recycle bin, so if the bin has been purged, the table is lost. Restoring to a point in time is different: Oracle Flashback features use the Automatic Undo Management (AUM) system to obtain metadata and historical data for transactions. They rely on undo data, which are records of the effects of individual transactions. For example, if a user executes an UPDATE statement to change a salary from 1000 to 1100, then Oracle Database stores the value 1000 in the undo data.
If you want to just select from a point in time, you can do this:
select * from <table name> as of timestamp systimestamp - interval '0 06:00:00' day to second;
That will select from the table as of 6 hours earlier.
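Since the interval format is easy to get backwards, a tiny helper that builds these queries keeps the days/hours/minutes arithmetic straight. This is just a string-building sketch (the table name is a placeholder, not a real table):

```python
def as_of_query(table: str, days: int = 0, hours: int = 0, minutes: int = 0) -> str:
    """Build an Oracle flashback SELECT ... AS OF TIMESTAMP for some time ago.

    The interval literal is formatted as 'D HH:MM:SS' for DAY TO SECOND.
    """
    interval = f"{days} {hours:02d}:{minutes:02d}:00"
    return (f"select * from {table} as of timestamp "
            f"systimestamp - interval '{interval}' day to second")

print(as_of_query("my_table", hours=6))
```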
AJAX Cross domain with PHP and PL/SQL using JSONP
Currently, if you try to use a regular AJAX call cross-domain, the same-origin policy will not allow it. One way around this is JSONP (JavaScript Object Notation with Padding). The name makes no sense, and trying to work out what the padding means is useless, but it allows cross-domain communication, and that is the important part. On my P drive I made a PHP page like this:
<?php
header("content-type: text/javascript");
if(isset($_GET['name']) && isset($_GET['callback']))
{
    // Create the response object explicitly rather than relying on
    // implicit stdClass creation from the first property assignment
    $obj = new stdClass();
    $obj->name = $_GET['name'];
    $obj->message = "Hello " . $obj->name;
    // Wrap the JSON in the caller-supplied callback name (the "padding")
    echo $_GET['callback'] . '(' . json_encode($obj) . ');';
}
?>
The JSON is $obj, and the encoding formats it as JavaScript object notation, which allows access to $obj's name and message.
Next, the PL/SQL page:
create or replace procedure ajax_test is
begin
htp.prn('
<!DOCTYPE html>
<html lang="en">
<head>
<title>JQuery JSONP</title>
<script src="http://ajax.googleapis.com/ajax/libs/jquery/1.5.2/jquery.min.js"></script>
<script>
$(document).ready(function(){
$("#useJSONP").click(function(){
$.ajax({
url: "http://www.wou.edu/~braken/test_stuff.php",
data: {name: "Chad"},
dataType: "jsonp",
jsonp: "callback",
jsonpCallback: "jsonpCallback",
success: function(){
alert("success");
}
});
});
});
function jsonpCallback(data){
$("#jsonpResult").text(data.message);
}
</script>
</head>
<body>
<input type="button" id="useJSONP" value="Use JSONP"><br /><br />
<span id="jsonpResult"></span>
</body>
</html>
');
end ajax_test;
With this, as soon as you hit the button, it calls the PHP page and sends it the data "Chad". The PHP page gets that name, makes a $obj with a name and a message, and encodes it in JSON. It then calls the function named in jsonpCallback in the AJAX call, which takes the data and displays it. Notice that the message, since it was encoded in JSON, can be accessed as data.message, which is the $obj from the PHP page.
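The wire format here is nothing more than the callback name wrapped around the JSON text. What the browser actually receives from the PHP page could be sketched like this:

```python
import json

def jsonp_response(callback: str, obj: dict) -> str:
    """Wrap a JSON payload in the caller-supplied callback name (the JSONP 'padding')."""
    return f"{callback}({json.dumps(obj)});"

# Mirrors the PHP page above for ?name=Chad&callback=jsonpCallback
print(jsonp_response("jsonpCallback", {"name": "Chad", "message": "Hello Chad"}))
# jsonpCallback({"name": "Chad", "message": "Hello Chad"});
```

The browser then evaluates that response as a script, which is why the callback function has to exist in the page's global scope.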
Thursday, January 19, 2012
Stuff and things
I've noticed that Soukup is taking on a lot of stuff with Student and FIS/HR. I really need to get to the point where he can worry about just Student. I hope it doesn't take a long time for this to happen.
This week we have been working on the leave accrual. Mike is taking Wendler's old script and rewriting it. We were also talking about putting it into Job Submission; that way we no longer have to worry about it. We can teach them how to run it and then wash our hands of making the mistakes.