I’m upgrading Spring 4 to Spring 5 and Hibernate 4 to Hibernate 5.
My existing functionality stopped working and broke with the following exception.
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'crm.hibernate_sequence' doesn't exist
I’m using MariaDB/MySQL, so unlike Oracle, I don’t maintain any sequence tables. Moreover, everything worked without issues before the upgrade. All my primary keys are auto-generated with the following annotations.
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private Long id;
Thanks to https://stackoverflow.com/questions/32968527/hibernate-sequence-doesnt-exist
Adding hibernate.id.new_generator_mappings=false to Hibernate properties solved the problem.
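For reference, here is a sketch of where the flag goes when Hibernate is configured through Spring; the bean layout and surrounding property names are assumptions, only the hibernate.id.new_generator_mappings key is the actual fix.

```xml
<bean id="sessionFactory"
      class="org.springframework.orm.hibernate5.LocalSessionFactoryBean">
    <!-- dataSource, packagesToScan etc. omitted -->
    <property name="hibernateProperties">
        <props>
            <!-- fall back to the legacy (Hibernate 4) id generator behaviour -->
            <prop key="hibernate.id.new_generator_mappings">false</prop>
        </props>
    </property>
</bean>
```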
Wah, what a stupid error it was!
My Spring web app, which had been running smoothly, suddenly refused to start. I couldn’t find anything useful in the log4j or Tomcat logs. The only clue I had was –
No Spring WebApplicationInitializer types detected on classpath
Here is how I solved it –
- Stop Tomcat
- Clean and build all Eclipse projects
- Go to the Servers tab and select the Tomcat server. Press Clean, then Clean Tomcat Work Directory.
- Right-click on the Tomcat server and remove it
- Delete Tomcat from the Eclipse runtimes
- Add the Tomcat server to the Eclipse Servers view again
- Start the application
I’m writing a unit test for my Spring DAO, which includes both write and read test cases. Here is the exception that broke the test execution.
org.hibernate.HibernateException: No Session found for current thread
Any such operation needs the @Transactional annotation.
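A minimal sketch of how the fix looks in a Spring JUnit test; the DAO, entity, and context file names here are hypothetical, and only @Transactional is the actual fix.

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml") // hypothetical test config
@Transactional // binds a Hibernate session to the test thread; without it: "No Session found for current thread"
public class UserDaoTest {

    @Autowired
    private UserDao userDao; // hypothetical DAO under test

    @Test
    public void savesAndReadsUser() {
        User user = new User("jessica"); // hypothetical entity
        userDao.save(user);
        // read it back within the same transaction
        userDao.findByName("jessica");
    }
}
```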
I got this exception when I executed my newly written JUnit test for a Spring DAO with Hibernate Validator.
Adding the javax.el dependency to the pom resolved it.
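For reference, this is one commonly used Maven coordinate for the javax.el implementation; treat the exact artifact and version as assumptions that depend on your stack.

```xml
<dependency>
    <groupId>org.glassfish</groupId>
    <artifactId>javax.el</artifactId>
    <version>3.0.0</version>
</dependency>
```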
I faced this error when exporting the database from one server to another, because the definer user didn’t exist on the target server. So I changed the incorrect username to the right one as given below.
Execute this query to get the list of ALTER statements to be executed.
SELECT CONCAT("ALTER DEFINER=`youruser`@`host` VIEW ",
table_name, " AS ", view_definition, ";")
FROM information_schema.views
WHERE table_schema = 'your-database-name';
It would give a list of queries like the ones below.
ALTER DEFINER='jessica'@'%' VIEW vw_audit_log AS select `a`.`ID` AS `id`,`u`.`USER_NAME` AS `user_name`,`a`.`LOG_TYPE` AS `log_type`,`a`.`LOG_TIME` AS `log_time`,`a`.`MESSAGE` AS `message`,`a`.`STATUS` AS `status` from (`your-database-name`.`user_info` `u` join `your-database-name`.`audit_log` `a`) where (`u`.`ID` = `a`.`USER_ID`) order by `a`.`ID` desc;
ALTER DEFINER='jessica'@'%' VIEW vw_user_role AS select `ur`.`NAME` AS `ROLE_NAME`,`ur`.`EMAIL_PERMISSION` AS `EMAIL_PERMISSION`,`urm`.`user_id` AS `USER_ID`,
`urm`.`role_id` AS `ROLE_ID` from (`your-database-name`.`user_role` `ur` join `your-database-name`.`user_role_mapping` `urm`) where (`ur`.`ID` = `urm`.`role_id`);
ALTER DEFINER='jessica'@'%' VIEW vw_user_role_mapping AS select `ur`.`ROLE_NAME` AS `ROLE_NAME`,`ur`.`EMAIL_PERMISSION` AS `EMAIL_PERMISSION`,`ur`.`USER_ID` AS `USER_ID`,`ur`.`ROLE_ID` AS `ROLE_ID`,`ui`.`USER_NAME` AS `USER_NAME`,`ui`.`PASSWORD` AS `PASSWORD`,`ui`.`ENABLED` AS `ENABLED` from (`your-database-name`.`vw_user_role` `ur` join `your-database-name`.`user_info` `ui`) where (`ur`.`USER_ID` = `ui`.`ID`);
After executing these queries, the problem was resolved.
I’m sorry for pausing the tutorials. I need to complete a project first. Tutorial posts will resume after that.
Today, I tried to run a workflow with Oozie. Here is how I executed it.
hadoop@gandhari:/opt/hadoop-2.6.4/workspace/oozie$ ../../oozie/bin/oozie job --oozie http://gandhari:11000/oozie/ -Doozie.wf.application.path=hdfs://gandhari:9000/user/hadoop/feed/myflow.xml -dryrun
Unfortunately, it broke with the following error.
Error: E0501 : E0501: Could not perform authorization operation, User: hadoop is not allowed to impersonate hadoop
hadoop is my OS user, and it is also the user running the Oozie daemon. core-site.xml should contain the following entry to allow Oozie to proxy this user.
hadoop – OS user name
gandhari – hostname
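Using the placeholders above, the proxy-user entries in core-site.xml look like this; hadoop.proxyuser.&lt;user&gt;.hosts and hadoop.proxyuser.&lt;user&gt;.groups are the standard Hadoop property names, and the values shown (the single host, the * wildcard for groups) are one possible choice.

```xml
<property>
    <name>hadoop.proxyuser.hadoop.hosts</name>
    <value>gandhari</value>
</property>
<property>
    <name>hadoop.proxyuser.hadoop.groups</name>
    <value>*</value>
</property>
```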
Here is the exception that tripped me up on Saturday night and failed my Mapper task.
- The Mapper reads the input lines one by one and tokenizes them.
- The last token contains the path of a file in HDFS.
- I need to open that file and read its contents.
For the above task, the flow I followed in the Mapper was: get the FileSystem, open the file, read its contents, and close the FileSystem when done.
Worse, my mapper failed with the following exception.
org.apache.hadoop.mapred.MapTask: Ignoring exception during close for org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader@1cb3ec38
java.io.IOException: Filesystem closed
The FileSystem object is supposed to be shared, as Hadoop caches it per configuration. When I closed the FileSystem, the Mapper’s own input was closed along with it, which broke the complete flow. So I closed only the file stream and did not close the FileSystem explicitly, which resolved the problem.
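The corrected read inside the Mapper can be sketched as follows; the variable names are illustrative, and the key point is that try-with-resources closes only the stream while fs.close() is never called on the shared FileSystem.

```java
// Inside the Mapper's map() method (Hadoop 2.x API); names here are illustrative.
FileSystem fs = FileSystem.get(context.getConfiguration()); // shared, cached instance
Path path = new Path(lastToken); // path taken from the last token of the input line
try (FSDataInputStream in = fs.open(path);
     BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
    String line;
    while ((line = reader.readLine()) != null) {
        // process the referenced file's contents here
    }
}
// Do NOT call fs.close(): the cached FileSystem also backs the job's own input split.
```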