Hello Everyone
Once again, thank you for your replies.
@Barton:
Well, this certainly clarified some of my expressions. After trying out some
basics, I just cracked on with the problem, taking a look at the manual
whenever it was needed... A thorough read of the introduction does seem to
be very useful, though :-)
@Robert:
Thanks, I will look into the simplification part.
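Just to check that I am reading the manual right, here is what I think a
minimal matchdeclare/tellsimpafter rule would look like (my own toy example;
mylog is a made-up function name so I do not disturb the built-in log):
matchdeclare([aa, bb], all)$   /* aa and bb are pattern variables matching any expression */
tellsimpafter(mylog(aa*bb), mylog(aa) + mylog(bb))$   /* formal identity applied after the built-in simplifier */
mylog(p*q*r);   /* the rule should split this into a sum of mylog terms */
Is that roughly the idea?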
"Well, it's customary to pronounce a magical incantation..."
:-D Indeed. This I understand; I mean, I am not confused by the idea of the
product of the pdfs (i.i.d.) being converted to a sum by the log. The
part I am not clear on is the summation itself.
Whereas I would expect:
f(x,A,B) := A*exp(-B*x);                  (1)
L(X,A,B) := lsum(log(f(x,A,B)), x, X);    (2)
dL_dA(X,A,B) := diff(L(X,A,B), A);        (3)
Now, Eq 3 is the diff of the sum, or equivalently the sum of
diff(log(A*exp(-B*x)), A). However, when you look at the example I posted
earlier, it is worked out as something like A^(n/2)*exp(lsum(x,x,X)*B),
where the summation has gone inside the exp... :-/ (??)
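To make my question concrete, here is a toy run of the two forms I am
comparing (my own made-up data, A and B left symbolic, using Eqs 1 and 2
above):
logexpand : all$                               /* let log split products and powers */
data : [1, 2, 3]$                              /* made-up sample */
Lik  : product(f(data[i], A, B), i, 1, length(data));
     /* the likelihood itself: the exp factors combine, so the sum of the data ends up inside the exp */
logL : L(data, A, B);                          /* the log-likelihood from Eq 2: the sum sits outside the log */
ratsimp(logL - log(Lik));                      /* reduces to 0, so the two forms agree */
So, if I am reading it right, the "summation inside the exp" shows up when
you keep the likelihood as a product, and my Eq 2 is just the log of that.
Is that the whole story, or am I missing something?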
By the way, the book you recommended is available for preview through
Google Books: http://books.google.co.uk/books?id=TNYhnkXQSjAC . I personally
liked that it strikes a good balance between theory and practice. I have
found these two useful as well so far:
http://eu.wiley.com/WileyCDA/WileyTitle/productCd-0470090138.html and
http://www.amazon.com/Tools-Statistical-Inference-Exploration-Distributions/dp/0387946888
Regarding the problem I was looking into, it seems that it cannot be solved
with a direct application of MLE :-( . There are numerical solutions, though,
either through Newton's method of successive approximations or through
nonlinear regression. I tried Newton's method yesterday in Maxima and got it
working, but unfortunately convergence depends on the initial conditions :-/
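Roughly, what I tried looks like the sketch below (heavily simplified: the
exponential pdf and the data here are only for illustration, not my actual
problem; mnewton is the Newton-method solver from Maxima's mnewton package):
load(mnewton)$                          /* Newton's method for (systems of) equations */
data : [0.8, 1.3, 2.1, 0.4]$            /* made-up sample */
logL : lsum(log(B*exp(-B*x)), x, data); /* log-likelihood with a single unknown parameter B */
eq   : diff(logL, B);                   /* the likelihood (score) equation dL/dB = 0 */
mnewton([eq], [B], [1.0]);              /* solve for B numerically, starting from a guess of 1.0 */
In my actual problem the equivalent of that last step is where the
dependence on the initial conditions shows up.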
Regarding the mailing list, is it possible for one of the administrators to
add an alias to this account? I noticed that it complains because the
message originates from something at googlemail.com and not something at
gmail.com, which is how I registered... I tried the mailing list's "control
panel" but could not find anything. If it is not possible I will just
unsubscribe and re-subscribe. Please let me know.
All the best
Athanasios
On Tue, Mar 17, 2009 at 3:22 PM, Robert Dodier <robert.dodier at gmail.com> wrote:
> On 3/17/09, Athanasios Anastasiou <athanastasiou at googlemail.com> wrote:
>
> > Recommendation #1 was OK, but i lost you in #2 :-D
> > I need to do a bit more reading about simplification rules.
>
> Well, Maxima has two modes for generating new expressions from
> old ones; one is substitution of something for a symbol (this is called
> evaluation) and the other is application of formal identities
> (this is called simplification). For the most part simplification is
> carried out by Lisp code but there is a user-visible interface into
> the simplification system via tellsimp and friends.
> Take a look at the reference documentation for matchdeclare
> and tellsimpafter. (It is pretty dense, but I don't know any simpler
> way to explain it.)
>
> > "Continuous distribution, continuous parameter space". It starts by the
> > product of the pdf values for each X[i] (This is alright) but it ends up
> > with a sum in the exponent of %e (???).
>
> Well, it's customary to pronounce a magical incantation,
> "identical independent distribution" which causes the joint
> density of all the data to factor into the product of the density
> for each datum (and these terms have all the same form).
> So if each term is like exp(foo[i]) then the product of them is
> like exp(sum(foo[i], ...)). It's often easier to work with the
> logarithm of that, so you get just the sum.
>
> Hope this helps. By the way if you would like an introductory
> text, I recommend "Bayesian Data Analysis" by Gelman, Carlin,
> Stern, & Rubin.
>
> best
>
> Robert Dodier
>