[R] Fwd: Re: Help with winbugs code

Jim Lemon jim at bitwrit.com.au
Thu Jun 23 12:26:55 CEST 2011


Hi folks,
I'm forwarding this to the list as my email to nita was about getting 
her code to the list. Additionally, I'm running Linux and have no 
experience with WinBUGS.

Jim

-------- Original Message --------
Subject: 	Re: [R] Help with winbugs code
Date: 	Thu, 23 Jun 2011 16:49:33 +0700
From: 	nita yalina <tayalin at gmail.com>
To: 	Jim Lemon <jim at bitwrit.com.au>



Thanks for replying to my message... I really appreciate it. Here is my code;
I have also attached it as a text file. In that code I give the variable "y"
initial values in the INITS part, but that makes WinBUGS open a new window
saying "undefined real result". When I delete the variable "y" from the INITS
part, it says that some variables still have to be initialized. What should I do?


...very grateful for your help...

model{
for(i in 1:N){
#measurement equation model
      for(j in 1:P){
        y[i,j]~dnorm(mu[i,j],psi[j]) I(thd[j,z[i,j]],thd[j,z[i,j]+1])
        ephat[i,j]<-y[i,j]-mu[i,j]
      }


#Organizational Culture factor
mu[i,1]<-xi[i,1]
mu[i,2]<-lam[1]*xi[i,1]
mu[i,3]<-lam[2]*xi[i,1]

#User Capability factor
mu[i,4]<-xi[i,2]
mu[i,5]<-lam[3]*xi[i,2]
mu[i,6]<-lam[4]*xi[i,2]

#Support Mechanism factor
mu[i,7]<-xi[i,3]
mu[i,8]<-lam[5]*xi[i,3]
mu[i,9]<-lam[6]*xi[i,3]

#Interface Design factor
mu[i,10]<-xi[i,4]
mu[i,11]<-lam[7]*xi[i,4]
mu[i,12]<-lam[8]*xi[i,4]

#Perceived Quality factor
mu[i,13]<-xi[i,5]
mu[i,14]<-lam[9]*xi[i,5]
mu[i,15]<-lam[10]*xi[i,5]

#Perceived Ease of Use factor
mu[i,16]<-eta[i,1]
mu[i,17]<-lam[11]*eta[i,1]
mu[i,18]<-lam[12]*eta[i,1]

#Perceived Usefulness factor
mu[i,19]<-eta[i,2]
mu[i,20]<-lam[13]*eta[i,2]
mu[i,21]<-lam[14]*eta[i,2]
mu[i,22]<-lam[15]*eta[i,2]

#Attitude toward Use factor
mu[i,23]<-eta[i,3]
mu[i,24]<-lam[16]*eta[i,3]
mu[i,25]<-lam[17]*eta[i,3]

#Perceived Intention to Use factor
mu[i,26]<-eta[i,4]
mu[i,27]<-lam[18]*eta[i,4]
mu[i,28]<-lam[19]*eta[i,4]

#E-government Adoption factor
mu[i,29]<-eta[i,5]
mu[i,30]<-lam[20]*eta[i,5]

#structural equation model
xi[i,1:5] ~dmnorm(u[1:5],phi[1:5,1:5])


eta[i,1]~dnorm(nu[i,1],pskp)
nu[i,1]<-gam[1]*xi[i,2]+gam[2]*xi[i,3]+gam[3]*xi[i,4]
dthat[i,1]<-eta[i,1]-nu[i,1]

eta[i,2]~dnorm(nu[i,2],pspk)
nu[i,2]<-gam[4]*xi[i,1]+beta[1]*eta[i,1]
dthat[i,2]<-eta[i,2]-nu[i,2]

eta[i,3]~dnorm(nu[i,3],pssp)
nu[i,3]<-beta[2]*eta[i,2]+beta[3]*eta[i,3]
dthat[i,3]<-eta[i,3]-nu[i,3]

eta[i,4]~dnorm(nu[i,4],psnm)
nu[i,4]<-beta[4]*eta[i,1]+beta[5]*eta[i,2]+gam[5]*xi[i,5]
dthat[i,4]<-eta[i,4]-nu[i,4]

eta[i,5]~dnorm(nu[i,5],psae)
nu[i,5]<-beta[6]*eta[i,4]
dthat[i,5]<-eta[i,5]-nu[i,5]
}#end of i loop

for (i in 1:5) {u[i]<-0.0}


#lambda
var.lam[1]<-8.0*psi[2]
var.lam[2]<-8.0*psi[3]
var.lam[3]<-8.0*psi[5]
var.lam[4]<-8.0*psi[6]
var.lam[5]<-8.0*psi[8]
var.lam[6]<-8.0*psi[9]
var.lam[7]<-8.0*psi[11]
var.lam[8]<-8.0*psi[12]
var.lam[9]<-8.0*psi[14]
var.lam[10]<-8.0*psi[15]
var.lam[11]<-8.0*psi[17]
var.lam[12]<-8.0*psi[18]
var.lam[13]<-8.0*psi[20]
var.lam[14]<-8.0*psi[21]
var.lam[15]<-8.0*psi[22]
var.lam[16]<-8.0*psi[24]
var.lam[17]<-8.0*psi[25]
var.lam[18]<-8.0*psi[27]
var.lam[19]<-8.0*psi[28]
var.lam[20]<-8.0*psi[30]

for (i in 1:20) {lam[i] ~dnorm(1,var.lam[i])}
for (j in 1:P) {
psi[j] ~dgamma(10,8)
sgl[j]<-1/psi[j]
}

#gamma
gam[1]~dnorm(0.4,var.pk)
gam[2]~dnorm(0.5,var.kp)
gam[3]~dnorm(0.4,var.kp)
gam[4]~dnorm(0.6,var.kp)
gam[5]~dnorm(0.1,var.nm)

var.pk<-8.0*pspk
pspk~dgamma(10,8)
sgpk<-1/pspk
var.kp<-8.0*pskp
pskp~dgamma(10,8)
sgkp<-1/pskp
var.sp<-8.0*pssp
pssp~dgamma(10,8)
sgsp<-1/pssp
var.nm<-8.0*psnm
psnm~dgamma(10,8)
sgnm<-1/psnm
var.ae<-8.0*psae
psae~dgamma(10,8)
sgae<-1/psae

#beta
beta[1] ~dnorm(0.4,var.pk)
beta[2] ~dnorm(0.5,var.sp)
beta[3] ~dnorm(0.6,var.sp)
beta[4] ~dnorm(0.6,var.nm)
beta[5] ~dnorm(0.5,var.nm)
beta[6] ~dnorm(0.4,var.ae)

phi[1:5,1:5] ~dwish(R[1:5,1:5],30)
phx[1:5,1:5]<-inverse(phi[1:5,1:5])


}
#end of model

DATA
list(N=43, P=30,
R=structure(
.Data=c(10,0,0,0,0,
0,10,0,0,0,
0,0,10,0,0,
0,0,0,10,0,
0,0,0,0,10
),
.Dim=c(5,5)),
thd=structure(
.Data=c(-250,-1.99072046156042,-1.08241139430414,0.983052916910141,250,
-250,-200,-1.47752529199845,0.452147411138078,250,
-250,-200,-1.08241139430414,0.730448177619092,250,
-250,-1.99072046156042,-0.585607161227169,1.08241139430414,250,
-250,-1.99072046156042,-0.265404753825216,1.08241139430414,250,
-250,-200,-0.656304990872144,0.892559673266593,250,
-250,-200,-1.08241139430414,0.517723552818072,250,
-250,-1.99072046156042,-1.47752529199845,0.585607161227169,250,
-250,-200,-1.99072046156042,0.265404753825216,250,
-250,-200,-1.67966118528897,0.656304990872144,250,
-250,-1.99072046156042,-1.67966118528897,0.517723552818072,250,
-250,-1.67966118528897,-0.892559673266593,0.730448177619092,250,
-250,-200,-1.08241139430414,0.80884440410662,250,
-250,-200,-1.67966118528897,0.730448177619092,250,
-250,-1.67966118528897,-1.19379507272265,0.80884440410662,250,
-250,-200,-1.32236537894944,0.98305291691014,250,
-250,-200,-0.656304990872144,1.08241139430414,250,
-250,-1.99072046156042,-1.32236537894944,0.585607161227169,250,
-250,-200,-1.19379507272265,0.656304990872144,250,
-250,-1.99072046156042,-1.67966118528897,0.80884440410662,250,
-250,-1.99072046156042,-1.19379507272265,0.656304990872144,250,
-250,-1.67966118528897,-1.19379507272265,0.452147411138078,250,
-250,-200,-1.47752529199845,0.656304990872144,250,
-250,-1.99072046156042,-1.47752529199845,0.585607161227169,250,
-250,-200,-1.19379507272265,0.730448177619092,250,
-250,-1.67966118528897,-0.80884440410662,0.892559673266593,250,
-250,-1.99072046156042,-1.67966118528897,0.585607161227169,250,
-250,-200,-0.983052916910141,0.80884440410662,250,
-250,-200,-1.67966118528897,0.585607161227169,250,
-250,-1.99072046156042,-1.08241139430414,0.892559673266593,250),
.Dim=c(30,5)),
z=structure(
.Data=c(3,3,2,3,2,2,4,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,2,2,3,3,3,3,3,2,3,3,3,3,2,3,2,2,2,2,2,2,2,3,3,3,3,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,2,2,3,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,3,4,3,3,3,3,3,2,3,4,2,4,4,4,4,3,3,3,3,2,2,3,3,3,3,3,3,3,3,3,3,3,3,4,4,4,2,2,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,4,3,3,3,3,3,3,3,3,3,2,2,2,3,3,3,3,3,4,3,2,3,3,3,3,4,3,2,4,3,3,3,3,3,3,3,3,3,2,3,2,2,2,2,3,3,4,3,3,2,3,3,3,3,2,4,3,3,4,4,3,3,2,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,2,3,4,4,2,2,3,3,4,4,4,4,3,3,3,3,3,4,3,3,3,3,3,4,4,4,3,4,4,4,4,3,3,3,3,3,3,4,3,3,3,3,2,3,3,1,4,4,4,2,3,3,3,3,3,3,1,3,3,3,3,2,3,3,2,2,2,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,2,3,2,4,3,2,2,2,2,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,2,2,2,2,3,3,3,4,4,4,4,3,3,2,4,3,3,3,2,3,4,2,4,4,3,3,3,3,3,3,3,3,3,3,3,3,3,3,
3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,2,3,3,3,3,2,4,4,4,3,3,3,3,3,3,3,3,4,4,3,3,3,3,4,3,1,4,3,4,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,2,3,3,3,3,3,2,3,2,3,2,3,4,3,3,3,3,3,4,4,3,3,3,3,3,3,3,3,3,3,3,4,3,3,3,4,3,3,4,3,3,3,3,3,3,2,2,3,3,4,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,4,4,3,4,4,3,4,4,4,4,3,3,3,3,3,3,3,3,3,4,4,3,3,3,3,3,3,3,3,3,2,3,4,3,2,3,1,3,3,3,3,2,2,2,3,3,2,3,2,2,2,4,4,4,2,3,2,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,2,2,3,3,2,3,3,2,2,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,4,4,3,3,2,2,1,1,2,2,2,2,2,1,1,2,2,2,1,1,1,2,2,2,2,2,2,2,2,2,3,2,1,1,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,3,4,4,3,3,4,4,3,3,4,4,4,3,3,3,4,4,4,3,3,4,4,4,4,4,4,4,4,4,4,3,3,3,2,2,3,4,4,4,3,3,3,4,4,4,3,3,4,3,3,3,3,3,3,3,3,
4,3,4,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,2,3,3,3,3,2,3,3,3,3,3,3,3,2,3,3,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,3,4,3,3,3,3,3,3,4,3,3,3,4,4,4,3,4,3,3,3,3,4,3,3,3,3,3,3,3,3,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,4,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,4,3,2,3,4,2,3,4,3,2,1,3,4,1,3,2,1,3,2,1,2,2,1,2,2,1,2,3,4,2,2,3,2,2,2,3,3,3,3,3,3,3,3,3,3,3,3,3,2,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,3,4,3,3,2,3,4,4,4,4,4,4,3,3,4,4,4,4,3,3,4,4,4,4,3,3,4,3,4,3
),
.Dim=c(43,30)))

INITS
list(
lam=c(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0),
psi=c(1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0,1.0),
pspk=1.0, pskp=1.0, pssp=1.0, psnm=1.0, psae=1.0,
gam=c(0,0,0,0,0),
beta=c(0,0,0,0,0,0),
phi=structure(.Data=c(1,0,0,0,0,
0,1,0,0,0,
0,0,1,0,0,
0,0,0,1,0,
0,0,0,0,1
),.Dim=c(5,5)),
xi=structure(.Data=c(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0), 


.Dim=c(43,5)),
eta=structure(.Data=c(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0),
.Dim=c(43,5)),
y=structure(
.Data=c(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,
0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
),
.Dim=c(43,30))

)
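
A note on the inits, offered only as a minimal sketch and not a verified fix:
one common reason for the "undefined real result" trap with I(,)-constrained
nodes is that the supplied initial values fall outside their truncation
intervals. The all-zero y matrix above starts every y[i,j] at 0, which for some
cells lies outside (thd[j,z[i,j]], thd[j,z[i,j]+1]). The R code below builds
starting values at the midpoint of each interval from the thd and z matrices in
the DATA list; the function name make.y.inits and the clamping of the +/-250
and -200 sentinel thresholds are illustrative choices, not part of the original
model.

# Sketch: one midpoint starting value per y[i,j], guaranteed to lie inside
# its own interval (thd[j,z[i,j]], thd[j,z[i,j]+1]).
# 'thd' is the 30 x 5 threshold matrix and 'z' the 43 x 30 ordinal data matrix.
make.y.inits <- function(thd, z, clamp = 5) {
  N <- nrow(z)                               # number of respondents
  P <- ncol(z)                               # number of items
  y <- matrix(NA, N, P)
  for (i in 1:N) {
    for (j in 1:P) {
      lo <- max(thd[j, z[i, j]],     -clamp) # lower threshold, sentinel clamped
      hi <- min(thd[j, z[i, j] + 1],  clamp) # upper threshold, sentinel clamped
      y[i, j] <- (lo + hi) / 2               # midpoint is inside the interval
    }
  }
  y
}
# e.g. y.init <- make.y.inits(thd, z); dput(y.init) gives values that could
# replace the all-zero y block in the INITS list above.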

On Thu, Jun 23, 2011 at 4:36 PM, Jim Lemon <jim at bitwrit.com.au> wrote:

     On 06/22/2011 05:29 PM, nita yalina wrote:

         Good afternoon sir,
         I'm a student in Indonesia studying technology management. I
         need to evaluate the software I have been developing, and I was
         told to use Bayesian SEM because I don't have a large sample. I
         don't know much about statistics, but I tried to write the code
         in WinBUGS and I ran into a problem. Here I attach my WinBUGS
         code and my model. Could you help me find what's wrong with my
         code? Thank you very much for your answer.

     Hi nita,
     Your code didn't make it to the list. If it is plain text, just
     paste it into the message, but I suspect that you tried to send a
     Word document or something like that. If so, copy and paste the
     contents of whatever it was into Notepad, then save that as a .txt
     file and it should get through the filters.

     Jim


-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: semmodel.txt
URL: <https://stat.ethz.ch/pipermail/r-help/attachments/20110623/e6579fb9/attachment.txt>

