Creating a physical data service where the Oracle user is not the owner of the schema - Data Service Integrator

Env: ODSI 10gR3, Workshop 10gR3, Oracle DB 10gR3
We have some tables in Oracle that are owned by a schema called "OWNER_SCHEMA". However, we are constrained by our DBA to log in as the schema called "USER_SCHEMA". USER_SCHEMA has been given select, insert, update, and delete privileges on the OWNER_SCHEMA tables. In addition, synonyms have been created in USER_SCHEMA for each of the tables in OWNER_SCHEMA.
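The setup described above could be reproduced with something like the following sketch (schema and table names are taken from the post; the exact grant list is an assumption):

```sql
-- Run as OWNER_SCHEMA (or a DBA): grant DML privileges to USER_SCHEMA.
GRANT SELECT, INSERT, UPDATE, DELETE ON OWNER_SCHEMA.TBL_ONE TO USER_SCHEMA;

-- Run as USER_SCHEMA: create a synonym so the table can be referenced
-- without the OWNER_SCHEMA prefix.
CREATE SYNONYM TBL_ONE FOR OWNER_SCHEMA.TBL_ONE;
```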
Using the ODSI wizard in Workshop to create a physical data service for the OWNER_SCHEMA tables:
1. we specify a JDBC data source corresponding to "USER_SCHEMA".
2. we select a table belonging to OWNER_SCHEMA, e.g. OWNER_SCHEMA.TBL_ONE
However, the physical data service created does not have the column names corresponding to TBL_ONE. Any ideas what could be going on here? 

If the physical data service does not contain the columns from TBL_ONE, what does it contain? Nothing? Columns with other names? Some of the columns? Can you post the output of Describe USER_SCHEMA.TBL_ONE and the generated data service?
ODSI uses JDBC metadata calls to introspect database tables. It displays what the metadata calls return; if there are columns with types that are not supported, ODSI will omit them. Other than that, my guess would be that your JDBC data source is not aimed where you think - possibly at a different database instance.
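One way to cross-check what the JDBC metadata calls should see is to query the Oracle data dictionary directly from the same account (a sketch; TBL_ONE is the table from the post):

```sql
-- Columns of the table as visible to the connected user.
SELECT owner, table_name, column_name, data_type
FROM   all_tab_columns
WHERE  table_name = 'TBL_ONE'
ORDER  BY owner, column_id;

-- Synonyms visible to the connected user that point at that table.
SELECT synonym_name, table_owner, table_name
FROM   all_synonyms
WHERE  table_name = 'TBL_ONE';
```

If the first query returns rows under OWNER_SCHEMA but not USER_SCHEMA, selecting the object under the wrong schema node in the wizard would explain the empty result.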

Sorry, my bad. In Workshop, I had selected the data objects under "USER_SCHEMA" instead of "OWNER_SCHEMA". Correcting this fixed the problem.


Loading data from Access when the database structure is different

I have an Access database. I want to copy data from Access to Oracle tables. Some tables in Oracle have a different structure than in Access, e.g. there are four tables in Access and I need to copy the data from these four tables into one table in Oracle. There are also some tables in Oracle that have different field names than in Access.
Can I do this with the Migration Workbench?
This is not a supported use case for the Migration Workbench.
You could use our script generation feature to manually create the schema creation and data migration scripts, then update the scripts (rename the tables), load them cleanly, and then use SQL to populate your existing tables.
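The last step (using SQL to populate the existing tables) might look like the following sketch, assuming the four Access tables were loaded into hypothetical staging tables STG_A through STG_D and the target columns have different names:

```sql
-- Hypothetical staging tables and column names; adjust to your schema.
-- Each SELECT maps the staged Access columns onto the Oracle names.
INSERT INTO target_table (cust_id, cust_name, source_system)
SELECT id, name, 'A' FROM stg_a
UNION ALL
SELECT id, name, 'B' FROM stg_b
UNION ALL
SELECT id, name, 'C' FROM stg_c
UNION ALL
SELECT id, name, 'D' FROM stg_d;
```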

Oracle Data Integrator Interface change to a different database

Hi All, Could you please tell me what needs to be changed in an ODI interface if the source/target database is changed from Oracle to any other DB? Thanks
These are some of the high-level points I can think of:
1. First of all, create a new Data Server under the appropriate technology.
2. Create a Logical Schema and attach it to the Physical Schema.
3. Navigate to Designer and open the Model linked with the Oracle Schema.
4. Change the Logical Schema and reverse-engineer the objects (tables, views, etc.).
5. Check that the interfaces are using the correct KMs (e.g. if you are using Oracle Incremental Update as the IKM, with a MS SQL database you may want to use MSSQL Incremental Update).
6. Check the staging area in the interface; if one is selected, ensure it is pointing to the correct logical schema.
7. Check any procedures where you might have selected a logical schema and technology.
8. Additionally, if you are using any Oracle built-in functions like TO_DATE, you may have to change them to CAST.
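As an illustration of the last point, an Oracle-specific expression in a mapping would need to be rewritten in portable (or target-specific) SQL. Both snippets below are hypothetical mapping expressions, not from the thread:

```sql
-- Oracle-specific mapping expression:
TO_DATE(src.order_dt, 'YYYY-MM-DD')

-- Portable ANSI alternative (exact form depends on the target database):
CAST(src.order_dt AS DATE)
```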

Help with Oracle Entity Framework and synonyms

I have some trouble using schemas and synonyms for tables. We are working with different schemas: one of them is the owner of the tables, and the other uses synonyms, with permissions to access, modify, and delete rows in those tables; this second one is the schema the application uses. So what is the way to use these synonyms to the tables from the application schema with Entity Framework?
While I am connected as the application schema I cannot get data from the tables, because the application schema doesn't own any tables, it only has synonyms to them.
Thanks in advance
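One workaround sometimes used in this situation (not confirmed in this thread, so treat it as an assumption) is to wrap each synonym in a view in the application schema, since design-time tools that skip synonyms will usually pick up views:

```sql
-- Run as the application schema.
-- EMPLOYEES_SYN is a hypothetical synonym pointing at the owner schema's table.
CREATE OR REPLACE VIEW employees_v AS
SELECT * FROM employees_syn;
```

The view can then be mapped in the Entity Framework model like an ordinary table.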
I have the same problem. I have created a data connection in the VS2010 Server Explorer to my Oracle database. In the filter I have added the schemas that my user ID has access to, and the tables are shown in Server Explorer. However, when I go through the ADO.NET entity wizard, the tables are not shown.
Same problem here.

Mixed context problem

Hi! I have a problem with the IKM SQL Control Append (Append Mode) module and different contexts. My target and source table have the same logical schema. However, I have two contexts: development, which looks at a development schema on the TDW base, and prod_test, which looks at another schema on the TDW base consisting of masked views over masked production data on the PDW base (via a static DB link). I execute the interface with the prod_test context; my source is on the execution context and my target has the development context hardcoded. For reasons unknown, the source table is resolved to the development context. When I change my target table to work and execute the interface again, the source table is resolved to the prod_test context. The optimization context is development. Does anyone have any idea why odiRef.getTable("L","INT_NAME","A") behaves like this? Regards, Ivan
Hi Ivan, Why don't you create two logical schemas, one for the schema on your dev server and one for the schema on your prod_test, and then use the source and target from two different models? In that case you will need only a single context to map the two logical schemas to their different physical schemas. That is the simple way of doing such a load. About your question: odiRef.getTable("L","INT_NAME","A") creates the integration table in the staging area. The staging area is usually the target logical schema, but the physical schema is decided by the execution context. The optimization context is used only to validate the joins and mappings during development; it has no significance during execution. Anyway, I suggest you take the simple path and use two logical schemas instead of one, since you have two data servers. Also, if the contexts are hardcoded, such a scenario cannot be delivered to another environment. Regards, Nishikant
Hi Nishikant, The reason I have the same logical schema is that the developed job will get exported to production, where I have only one context and both the source and target tables will be in the same physical schema. Regards, Ivan

Which is the right way of selecting a schema in DDL generated by Datamodeler?

I am using SQL Datamodeler to generate both the ERD and the DDL for Oracle 11g R2 (Oracle Application Express). My generated DDL lacks a schema definition: what would be the best way to assign a given schema so the created objects belong to the right schema? I am later loading the DDL via a SQL Developer connection. Thanks,
M.Emmanuel wrote: I am using SQL Datamodeler to generate both the ERD and the DDL for Oracle 11g R2 (Oracle Application Express). My generated DDL lacks a schema definition: what would be the best way to assign a given schema so the created objects belong to the right schema? I am later loading the DDL via a SQL Developer connection. Thanks,
The objects manipulated by the DDL will be owned by whichever USER executes the DDL. This is a benefit, not a problem which needs to be solved.
John has provided the answer that you need. If you want user Scott to own the objects from your DDL, then connect as Scott in SQL Developer and execute your DDL as John suggested - under the Scott user. Just make sure that your user (Scott) has the necessary privileges to create the types of objects that your DDL contains.
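In practice that amounts to either connecting as the owning schema, or switching the session's default schema before running the generated script (SCOTT is the hypothetical owner here):

```sql
-- Option 1: connect as SCOTT and run the generated DDL unchanged.

-- Option 2: connect as a privileged user and redirect unqualified names.
-- Note this only changes name resolution; the connected user still needs
-- privileges such as CREATE ANY TABLE to create objects in SCOTT's schema.
ALTER SESSION SET CURRENT_SCHEMA = SCOTT;
-- Now run the generated DDL; unqualified objects are created in SCOTT.
```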
Hello, it depends on whether you define a physical model or not. If you do not have a physical model, you can define the schema by first adding the wanted schema (right-click on Schema in the Browser and add the schema) and then selecting that schema for the object from the list. If you do define a physical model (preferable), define a User in the physical model and add it as the user on all the wanted objects. Note that you can define it on just one table and then use Propagate Properties to add it to the other tables. Also note that a physical model must be open when generating the DDL; if it is not open, the schema defined will not take effect. Best regards, Heli