No Spring WebApplicationInitializer types detected on classpath

Wah, what a stupid error it was!

My Spring web app, which had been running smoothly, refused to start. I couldn’t find anything in the log4j or Tomcat logs. The only clue I had was –

No Spring WebApplicationInitializer types detected on classpath

Here is how I solved it –

  1. Stop Tomcat
  2. Clean and build all Eclipse projects
  3. Go to the Servers tab and select the Tomcat server. Press Clean, then press Clean Tomcat Work Directory
  4. Right-click the Tomcat server and remove it
  5. Delete Tomcat from the Eclipse server runtimes
  6. Add the Tomcat server to the Eclipse Servers view again
  7. Start the application
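For context, this message is logged when Spring’s SpringServletContainerInitializer scans the deployed WAR at startup and finds no class implementing WebApplicationInitializer – which is what happens when a stale Eclipse deployment ships without your compiled classes. Below is a minimal sketch of such an initializer; the class name MyWebAppInitializer and the nested AppConfig are placeholders for illustration, not my actual code.

import javax.servlet.ServletContext;
import javax.servlet.ServletRegistration;

import org.springframework.context.annotation.Configuration;
import org.springframework.web.WebApplicationInitializer;
import org.springframework.web.context.support.AnnotationConfigWebApplicationContext;
import org.springframework.web.servlet.DispatcherServlet;

// Spring discovers this class when the container starts. If the compiled class
// is missing from the deployed WAR (e.g. a stale Eclipse/Tomcat deployment),
// the "No Spring WebApplicationInitializer types detected" message appears.
public class MyWebAppInitializer implements WebApplicationInitializer {

    // Placeholder for the real Spring configuration class.
    @Configuration
    static class AppConfig { }

    @Override
    public void onStartup(ServletContext servletContext) {
        AnnotationConfigWebApplicationContext context = new AnnotationConfigWebApplicationContext();
        context.register(AppConfig.class);

        ServletRegistration.Dynamic dispatcher =
                servletContext.addServlet("dispatcher", new DispatcherServlet(context));
        dispatcher.setLoadOnStartup(1);
        dispatcher.addMapping("/");
    }
}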

 


javax.validation.ValidationException: HV000183: Unable to load ‘javax.el.ExpressionFactory’. Check that you have the EL dependencies on the classpath, or use ParameterMessageInterpolator instead

I got this exception when I executed a newly written JUnit test for a Spring DAO that uses Hibernate Validator.

After adding the javax.el dependencies to the pom, the issue was resolved.


<dependency>
    <groupId>javax.el</groupId>
    <artifactId>javax.el-api</artifactId>
    <version>2.2.4</version>
</dependency>
<dependency>
    <groupId>org.glassfish.web</groupId>
    <artifactId>javax.el</artifactId>
    <version>2.2.4</version>
</dependency>
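Alternatively, as the exception message itself suggests, you can skip the EL dependency and configure Hibernate Validator with ParameterMessageInterpolator. Here is a minimal sketch, assuming Hibernate Validator 5.x; the class name ValidatorHolder is just a placeholder.

import javax.validation.Validation;
import javax.validation.Validator;

import org.hibernate.validator.messageinterpolation.ParameterMessageInterpolator;

public class ValidatorHolder {

    // ParameterMessageInterpolator resolves only message parameters ({...}),
    // not EL expressions (${...}), so javax.el is not needed on the classpath.
    public static Validator buildValidator() {
        return Validation.byDefaultProvider()
                .configure()
                .messageInterpolator(new ParameterMessageInterpolator())
                .buildValidatorFactory()
                .getValidator();
    }
}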

 

MySQL error 1449: The user specified as a definer does not exist

I faced this error when exporting a database from one server to another, because the definer user did not exist on the target server. So I changed the incorrect definer to the right user as shown below.

With reference to http://stackoverflow.com/questions/10169960/mysql-error-1449-the-user-specified-as-a-definer-does-not-exist

Execute this query to generate the list of ALTER statements to be run.


SELECT CONCAT("ALTER DEFINER=`youruser`@`host` VIEW ",
              table_name, " AS ", view_definition, ";")
FROM information_schema.views
WHERE table_schema = 'your-database-name';

It will give a list of queries like the ones below.

ALTER DEFINER='jessica'@'%' VIEW vw_audit_log AS select `a`.`ID` AS `id`,`u`.`USER_NAME` AS `user_name`,`a`.`LOG_TYPE` AS `log_type`,`a`.`LOG_TIME` AS `log_time`,`a`.`MESSAGE` AS `message`,`a`.`STATUS` AS `status` from (`your-database-name`.`user_info` `u` join `your-database-name`.`audit_log` `a`) where (`u`.`ID` = `a`.`USER_ID`) order by `a`.`ID` desc;

ALTER DEFINER='jessica'@'%' VIEW vw_user_role AS select `ur`.`NAME` AS `ROLE_NAME`,`ur`.`EMAIL_PERMISSION` AS `EMAIL_PERMISSION`,`urm`.`user_id` AS `USER_ID`,
`urm`.`role_id` AS `ROLE_ID` from (`your-database-name`.`user_role` `ur` join `your-database-name`.`user_role_mapping` `urm`) where (`ur`.`ID` = `urm`.`role_id`);

ALTER DEFINER='jessica'@'%' VIEW vw_user_role_mapping AS select `ur`.`ROLE_NAME` AS `ROLE_NAME`,`ur`.`EMAIL_PERMISSION` AS `EMAIL_PERMISSION`,`ur`.`USER_ID` AS `USER_ID`,`ur`.`ROLE_ID` AS `ROLE_ID`,`ui`.`USER_NAME` AS `USER_NAME`,`ui`.`PASSWORD` AS `PASSWORD`,`ui`.`ENABLED` AS `ENABLED` from (`your-database-name`.`vw_user_role` `ur` join `your-database-name`.`user_info` `ui`) where (`ur`.`USER_ID` = `ui`.`ID`);

After executing these queries, the problem was resolved.


Oozie job failure – Error: E0501 : E0501: Could not perform authorization operation, User: hadoop is not allowed to impersonate hadoop

Hi hadoopers,

I’m sorry for pausing the tutorials. I need to complete a project first; the tutorial posts will resume after that.

Today I tried to build a workflow with Oozie. Here is how I executed it.


hadoop@gandhari:/opt/hadoop-2.6.4/workspace/oozie$ ../../oozie/bin/oozie job --oozie http://gandhari:11000/oozie/ -Doozie.wf.application.path=hdfs://gandhari:9000/user/hadoop/feed/myflow.xml -dryrun

Unfortunately it failed with the following error.


Error: E0501 : E0501: Could not perform authorization operation, User: hadoop is not allowed to impersonate hadoop


hadoop is my OS user, and it is also the user running the Oozie daemon. core-site.xml should contain the following entries, inside the <configuration> element, so that this user is allowed to proxy.


<property>
    <name>hadoop.proxyuser.hadoop.groups</name>
    <value>*</value>
</property>

<property>
    <name>hadoop.proxyuser.hadoop.hosts</name>
    <value>gandhari</value>
</property>

hadoop – OS user name

gandhari – hostname

 

java.io.IOException: Filesystem closed

Hi hadoopers,

Here is the exception that screwed me up on a Saturday night and failed my Mapper task.

  • The Mapper reads the input lines one by one and tokenizes them.
  • The last token contains the path of a file in HDFS.
  • I need to open that file and read its contents.

For the above task, the following is the flow I used in the Mapper.

[Diagram: Mapper flow using the HDFS FileSystem]

To make things worse, my mapper failed with the following exception.

org.apache.hadoop.mapred.MapTask: Ignoring exception during close for org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader@1cb3ec38
java.io.IOException: Filesystem closed
at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:689)
at org.apache.hadoop.hdfs.DFSInputStream.close(DFSInputStream.java:617)

The FileSystem object is supposed to be global: FileSystem.get() returns a shared, cached instance. When I closed the FileSystem, the Mapper’s own input was closed along with it, which broke the whole flow. So I closed only the file stream and left the FileSystem open, and that resolved the problem.
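Here is a minimal sketch of the fixed Mapper; the class name, the key/value types and the tab delimiter are my assumptions for illustration, not my actual code. The per-file stream is closed with try-with-resources, while the shared FileSystem is never closed.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class FileContentMapper extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] tokens = value.toString().split("\t");
        // The last token holds the HDFS path of the file to read.
        Path filePath = new Path(tokens[tokens.length - 1]);

        // FileSystem.get() returns a cached, shared instance; do NOT close it,
        // or the framework's own input streams get closed as well.
        FileSystem fs = FileSystem.get(context.getConfiguration());

        // try-with-resources closes only the stream, not the FileSystem.
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(fs.open(filePath)))) {
            String line;
            while ((line = reader.readLine()) != null) {
                context.write(new Text(filePath.getName()), new Text(line));
            }
        }
    }
}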

Ref: https://github.com/linkedin/gobblin/issues/1219

 

ERROR: Can’t get master address from ZooKeeper; znode data == null

I was setting up HBase and ZooKeeper for a lab exercise. After configuring them, I launched the status command in the HBase shell and ended up with the error given below.

ERROR: Can't get master address from ZooKeeper; znode data == null

The DFS and YARN daemons should be running for the HBase shell to give valid output. The error was resolved after starting the DFS and YARN daemons.