Vector Space as the set of solutions of the matrix equation $AX=O$












One of my professor's lecture notes on vector spaces starts with the following lines:

We have seen that if $\det(A) = 0$, then the system $AX=O$ has infinitely many solutions. We shall now see that in this case the set of solutions has a structure called a vector space.

My doubt is: in what sense is the set of (infinitely many) solutions of the equation $AX = O$ (given $|A|=0$) actually a vector space? How does the term "vector space" come into the picture?










linear-algebra vector-spaces matrix-equations matrix-calculus

asked Jan 29 at 18:56 by Onkar Singh (297), edited Jan 30 at 19:57






















3 Answers






























What your professor means is that the set of all $X$'s that satisfy $AX=0$ satisfies the properties of a vector space. You have to check all ten axioms of a vector space. Here are just a few:

It has an identity under vector addition, namely $X=(0,0,\cdots,0)^t$.

It is closed under addition, because $A(X+Y)=AX+AY=0$; so if $X,Y$ are in the set, then so is $X+Y$.

It is closed under scalar multiplication, because $A(\lambda X)=\lambda AX=0$.

You can check all the other axioms one by one. Another way to avoid checking many of the axioms: if you have already shown that $\mathbb{F}^n$ is a vector space for any field $\mathbb{F}$, then, since our set is a subset of it, associativity and many other properties are inherited from the superset. We then only need to show that $\alpha X + \beta Y$ is in the set, which follows from what we proved about closure under addition and scalar multiplication.
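As a quick numerical sanity check of these closure properties (a sketch in NumPy; the singular matrix $A$ below is just an illustrative example, not one from the question):

```python
import numpy as np

# A singular matrix: det(A) = 0, so AX = 0 has infinitely many solutions.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(A))        # 0.0 (up to floating-point rounding)

# Two particular solutions of AX = 0 (both lie on the line x = -2y).
X = np.array([-2.0, 1.0])
Y = np.array([4.0, -2.0])
print(A @ X, A @ Y)            # [0. 0.] [0. 0.]

# The zero vector is the additive identity and is itself a solution.
print(A @ np.zeros(2))         # [0. 0.]

# Closure under addition and under scalar multiplication.
print(A @ (X + Y))             # [0. 0.]
print(A @ (3.5 * X))           # [0. 0.]
```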






answered Jan 29 at 19:00 by stressed out (6,513)













• Seeing the examples of vector spaces, I realized that even polynomials can form a vector space. So is the true notion of a 'vector' being lost here? I mean, a polynomial has no directional sense. Everyone talks about $\mathbb{R}^n$, but how does that picture apply in the case of the polynomial example? – Onkar Singh Jan 29 at 19:11

• @OnkarSingh You are correct. In mathematics we deal with abstract objects, and the fact that a vector space can contain objects that are not geometric vectors should not bother you. Some authors use the term 'linear space' instead of 'vector space' because the objects are not necessarily vectors from the kind of geometry we are used to. Even the set of continuous real functions with pointwise addition and scalar multiplication forms a vector space, so one may call a function a vector! The terminology is unfortunate, but the idea of abstraction proves useful. – stressed out Jan 29 at 19:13

• Ok! So is this the notion of ABSTRACT algebra? Is that what the word 'abstract' stands for? – Onkar Singh Jan 29 at 19:17

• @OnkarSingh Exactly! Mathematicians study things abstractly. That's what makes us different from engineers or physicists. – stressed out Jan 29 at 19:19































It's simply the fact that a linear combination of solutions is again a solution: if $AX=0$ and $AY=0$, then
$$
A(\alpha X+\beta Y)=\alpha AX+\beta AY=0.
$$
Thus the set $\{X : AX=0\}$ is a vector space with the natural vector operations.
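For a concrete illustration (an illustrative matrix, not one taken from the question): take
$$
A=\begin{pmatrix}1 & 2\\ 2 & 4\end{pmatrix},\qquad \det(A)=0.
$$
Then $AX=0$ reduces to the single condition $x+2y=0$, so
$$
\{X : AX=0\}=\{\,t\,(-2,1)^t : t\in\mathbb{R}\,\},
$$
a line through the origin in $\mathbb{R}^2$. Sums and scalar multiples of points on this line stay on the line, which is exactly the vector-space structure being described.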






answered Jan 29 at 19:00 by Martin Argerami (129k)

































You must look at the actual definition of a vector space... In this case, since the set of solutions of $AX=0$ is a subset of $\mathbb{R}^n$, you just need to check that it is closed under the sum of vectors and under multiplication by scalars. Take $u,v \in \mathbb{R}^n$ such that $Au = Av = 0$ (basically any two solutions of $AX=0$) and $\alpha \in \mathbb{R}$. If you prove that $u+v$ and $\alpha u$ are also solutions of $AX=0$, you have your result. This is very straightforward to check:
$$ A(u+v)= Au + Av = 0 + 0 = 0$$

$$A(\alpha u) = \alpha (Au) = \alpha \cdot 0 = 0$$
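To see this subspace check in action numerically (a sketch; SciPy's null_space is used only to produce some concrete solutions, and the rank-1 matrix $A$ is just an illustrative example):

```python
import numpy as np
from scipy.linalg import null_space

# An illustrative singular matrix: every row is a multiple of (1, 2, 3),
# so det(A) = 0 and the solution set of AX = 0 is two-dimensional.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [3.0, 6.0, 9.0]])

N = null_space(A)            # columns form a basis of {X : AX = 0}
u, v = N[:, 0], N[:, 1]      # two particular solutions

alpha = -1.7
print(np.allclose(A @ (u + v), 0))      # True: u + v is again a solution
print(np.allclose(A @ (alpha * u), 0))  # True: alpha * u is again a solution
```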






answered Jan 29 at 19:06 by PierreCarre (1,665)













