Repo layout
===========
deployments/ - location for built wars (Details below)
src/ - Maven src structure
pom.xml - Maven build file
.openshift/ - location for OpenShift-specific files
.openshift/config/ - location for configuration files such as standalone.xml (used to modify the JBoss config, e.g. datasources)
.openshift/action_hooks/pre_build - Script that gets run every git push before the build (on the CI system if available)
.openshift/action_hooks/build - Script that gets run every git push as part of the build process (on the CI system if available)
.openshift/action_hooks/deploy - Script that gets run every git push after the build but before the app is restarted (a sample hook is sketched after this list)
.openshift/action_hooks/post_deploy - Script that gets run every git push after the app is restarted
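The action hooks are ordinary executable scripts (make sure they are committed
with the executable bit set). As a minimal sketch only (the "uploads" directory
is a hypothetical example, not something this project requires), the deploy hook
could look like:

    #!/bin/bash
    # Hypothetical deploy hook: runs on every git push after the build but
    # before the app is restarted. Ensure a directory exists in persistent
    # storage for content that must survive pushes.
    mkdir -p "$OPENSHIFT_DATA_DIR/uploads"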
Notes about layout
==================
Note: Every time you push, everything in your remote repo dir gets recreated,
so please store long-term items (like an SQLite database) in the OpenShift
data directory, which will persist between pushes of your repo.
The OpenShift data directory is accessible relative to the remote repo
directory (../data) or via the environment variable OPENSHIFT_DATA_DIR.
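For example (a sketch; it assumes the commands are run from the remote repo
directory, such as from an action hook, and "backups" is just an example name),
both path forms refer to the same persistent location:

    mkdir -p ../data/backups                 # relative to the remote repo dir
    mkdir -p "$OPENSHIFT_DATA_DIR/backups"   # the same directory via the env var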
Details about layout and deployment options
===========================================
There are two options for deploying content to the JBoss Application Server within OpenShift:
1) (Preferred) You can upload your content in a Maven src structure, as this sample project does, and on
git push have the application built and deployed. For this to work you'll need your pom.xml at the
root of your repository and a maven-war-plugin configuration, like the one in this sample, that moves the
build output to the deployments directory. By default the warName is ROOT within pom.xml. This will cause the
webapp contents to be rendered at http://app_name-namespace.rhcloud.com/. If you change the warName in
pom.xml to app_name, your base url would then become http://app_name-namespace.rhcloud.com/app_name.
Note: If you are building locally, you'll also want to add any wars/ears that the build outputs under
deployments to your .gitignore file.
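For example (a sketch; adjust the patterns to whatever your local build actually
produces):

    # Keep locally built artifacts out of version control
    echo 'deployments/*.war' >> .gitignore
    echo 'deployments/*.ear' >> .gitignore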
or
2) You can git push prebuilt wars (with the corresponding .dodeploy file for exploded wars) into deployments/. To do this
with the default repo you'll want to first run 'git rm -r src/ pom.xml' from the root of your repo.
Basic workflows for deploying prebuilt content (each operation will require associated git add/commit/push operations to take effect; a complete run-through of workflow B is sketched after this list):
A) Add new zipped content and deploy it:
1. cp target/example.war deployments/
B) Add new unzipped content and deploy it:
1. cp -r target/example.war/ deployments/
2. touch deployments/example.war.dodeploy
C) Undeploy currently deployed content:
1. git rm deployments/example.war.dodeploy deployments/example.war
D) Replace currently deployed zipped content with a new version and deploy it:
1. cp target/example.war deployments/
E) Replace currently deployed unzipped content with a new version and deploy it:
1. git rm -rf deployments/example.war/
2. cp -r target/example.war/ deployments/
3. touch deployments/example.war.dodeploy
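As a complete run-through of workflow B (the war name is hypothetical, and the
push assumes the default OpenShift git remote set up by 'rhc app create'):

    cp -r target/example.war/ deployments/
    touch deployments/example.war.dodeploy
    git add deployments/example.war deployments/example.war.dodeploy
    git commit -m "Deploy example.war"
    git push    # the push to OpenShift triggers the deployment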
WARNING: If you go with option 2) there are a couple of issues to keep in mind with both prebuilt and exploded
wars. With exploded wars the main issue is that committing binaries (class and jar files) can make merge
conflicts tedious. With prebuilt wars the main issue is that each time you modify the war and git push, the
size of the war file is taken out of your OpenShift file system quota. One alternative to this
(other than using Maven from option 1) is to use rsync to push your war into the deployments folder. You
would have to do this after each git push, followed by 'rhc app restart -a appname'. Example:
rsync -avz localdir/deployments/ UUID@appname-namespace.rhcloud.com:~/appname/repo/deployments/
Note: You can get the information in the uri above from running 'rhc domain show'
If you have already committed large files to your git repo, you can rewrite or reset the history of those files in git
to an earlier point in time and then 'git push --force' to apply those changes on the remote OpenShift server. A
git gc on the remote OpenShift repo can be forced with the command below (Note: tidy also does other cleanup, including
clearing log files and tmp dirs):
rhc app tidy -a appname
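As a sketch of the history rewrite itself (the war path, remote and branch are
assumptions, and rewriting history is destructive, so try it on a backup clone
first):

    # Drop a large war from every commit, then force-push the rewritten history
    git filter-branch --index-filter \
        'git rm -r --cached --ignore-unmatch deployments/example.war' HEAD
    git push --force
    # Reclaim the space on the OpenShift gear
    rhc app tidy -a appname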
Whether you choose option 1) or 2), the end result will be the application
deployed into the deployments directory. The deployments directory in the
JBoss Application Server distribution is the location end users can place
their deployment content (e.g. war, ear, jar, sar files) to have it
automatically deployed into the server runtime.
The filesystem deployment scanner in AS 7 and later works differently from
previous JBoss AS releases. The scanner will no longer attempt to directly
monitor the deployment content and decide if or when the end user wishes
the content to be deployed. Instead, the scanner relies on a system of marker
files, with the user's addition or removal of a marker file serving as a sort
of command telling the scanner to deploy, undeploy or redeploy content.
The marker files always have the same name as the deployment content to which
they relate, but with an additional file suffix appended. For example, the
marker file to indicate the example.war should be deployed is named
example.war.dodeploy. Different marker file suffixes have different meanings.
The relevant marker file types are:
.dodeploy      -- Placed by the user to indicate that the given content should
                  be deployed into the runtime (or redeployed if already
                  deployed in the runtime).
.isdeploying   -- Placed by the deployment scanner service to indicate that it
                  has noticed a .dodeploy file and is in the process of
                  deploying the content. This marker file will be deleted when
                  the deployment process completes.
.deployed      -- Placed by the deployment scanner service to indicate that the
                  given content has been deployed into the runtime. If an end
                  user deletes this file, the content will be undeployed.
.failed        -- Placed by the deployment scanner service to indicate that the
                  given content failed to deploy into the runtime. The content
                  of the file will include some information about the cause of
                  the failure.
.isundeploying -- Placed by the deployment scanner service to indicate that it
                  has noticed a .deployed file has been deleted and the
                  content is being undeployed. This marker file will be deleted
                  when the undeployment process completes.
.undeployed    -- Placed by the deployment scanner service to indicate that the
                  given content has been undeployed from the runtime. If an end
                  user deletes this file, it has no impact.
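For example, from an ssh session on the gear (a sketch; 'appname' and
'example.war' are placeholders), you can redeploy exploded content and then
check which marker the scanner left behind:

    touch ~/appname/repo/deployments/example.war.dodeploy
    # after the scanner runs, one of the marker files above reports the outcome
    ls ~/appname/repo/deployments/example.war.*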
Environment Variables
=====================
OpenShift provides several environment variables to reference for ease
of use. The following list includes some common variables but is far from exhaustive:
System.getenv("OPENSHIFT_GEAR_NAME") - Application name
System.getenv("OPENSHIFT_GEAR_DIR") - Application dir
System.getenv("OPENSHIFT_DATA_DIR") - For persistent storage (between pushes)
System.getenv("OPENSHIFT_TMP_DIR") - Temp storage (unmodified files deleted after 10 days)
When embedding a database using 'rhc app cartridge add', you can reference environment
variables for the host, port, username and password (see the sketch after this list):
System.getenv("OPENSHIFT_DB_HOST") - DB host
System.getenv("OPENSHIFT_DB_PORT") - DB Port
System.getenv("OPENSHIFT_DB_USERNAME") - DB Username
System.getenv("OPENSHIFT_DB_PASSWORD") - DB Password
To get a full list of environment variables, simply add a line in your
.openshift/action_hooks/build script that says "export" and push.